Error when loading with AutoModel.from_pretrained

#38
by Hayatoyoyo - opened

Hello, I am trying to run:

import torch
from transformers import AutoModel

model = AutoModel.from_pretrained(
    'openbmb/MiniCPM-o-2_6',
    trust_remote_code=True,
    attn_implementation='sdpa', # sdpa or flash_attention_2
    torch_dtype=torch.bfloat16,
    init_vision=True,
    init_audio=False,
    init_tts=False
)

with

torch                     2.6.0+cu126
torchaudio                2.6.0+cu126
torchvision               0.21.0+cu126
transformers              4.49.0

But I get the following error:

File ~\.cache\huggingface\modules\transformers_modules\openbmb\MiniCPM-o-2_6\9a8db9d033b8e61fa1f1a9f387895237c3de98a2\modeling_minicpmo.py:45
     43 from transformers import LlamaConfig
     44 from transformers import LlamaModel
---> 45 from transformers import LogitsWarper
     46 from transformers import PreTrainedModel
     47 from transformers import Qwen2ForCausalLM

ImportError: cannot import name 'LogitsWarper' from 'transformers' (c:\Users\yoyom\Documents\jupyter_lab\my_env\Lib\site-packages\transformers\__init__.py)

I have seen a thread about its removal here: https://github.com/oobabooga/text-generation-webui/issues/6703
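Until the model's remote code is updated, one possible workaround (a sketch, not an official fix) is to alias the removed name back onto the `transformers` module before calling `from_pretrained`. This assumes the remote code only imports or subclasses `LogitsWarper`, which was a thin subclass of `LogitsProcessor` before its removal. The stand-in module below is only there so the sketch runs anywhere; with `transformers` installed you would skip it and import the real package directly.

```python
import sys
import types

# Stand-in module so this sketch is runnable even without transformers
# installed; in practice, delete this part and `import transformers` directly.
_fake = types.ModuleType("transformers")

class LogitsProcessor:
    """Minimal stand-in for transformers.LogitsProcessor."""

_fake.LogitsProcessor = LogitsProcessor
sys.modules.setdefault("transformers", _fake)

import transformers

# The actual workaround: LogitsWarper subclassed LogitsProcessor before it
# was removed, so aliasing the name back is usually enough for remote code
# that only imports or subclasses it.
if not hasattr(transformers, "LogitsWarper"):
    transformers.LogitsWarper = transformers.LogitsProcessor

from transformers import LogitsWarper  # the failing import now succeeds
```

Alternatively, pinning an older library version (e.g. `pip install "transformers<4.49"`) should also avoid the error, at the cost of staying on an older release.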
