Version Conflict for Qwen2 from Transformers
#1 opened by BK-Lee
Thanks for the great work!
I would like to suggest removing `configuration_qwen2.py` and `modeling_qwen2.py` and simply replacing them with `from transformers.models.qwen2.configuration_qwen2 import Qwen2Config` and `from transformers import Qwen2ForCausalLM`.
The vendored files appear to cause Transformers version errors. I ran into them as well, but resolved the issue by removing those files and manually modifying the code as shown below.
From "configuration_eagle_chat.py" Line 13
from transformers.models.qwen2.configuration_qwen2 import Qwen2Config
From "modeling_eagle_chat.py" Line 51-52
# assert version_cmp(transformers.__version__, '4.37.2', 'ge')
# assert version_cmp(transformers.__version__, '4.39.2', 'le')
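If you would rather keep some guard than delete the checks entirely, a softer alternative is to warn instead of asserting. This is just a sketch, not part of the repo's code; the `tested_min`/`tested_max` bounds simply mirror the original asserts:
```python
import warnings

import transformers
from packaging import version

# Bounds taken from the original asserts; warn instead of failing hard
# when the installed transformers version falls outside the tested range.
tested_min, tested_max = "4.37.2", "4.39.2"
installed = version.parse(transformers.__version__)
if not (version.parse(tested_min) <= installed <= version.parse(tested_max)):
    warnings.warn(
        f"transformers {transformers.__version__} is outside the tested "
        f"range [{tested_min}, {tested_max}]; the model may still work."
    )
```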
From "modeling_eagle_chat.py" Line 23
from transformers import Qwen2ForCausalLM
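To confirm that both stock imports resolve on your install before touching the repo, a quick smoke test like the following can help. The tiny config values here are arbitrary, chosen only to keep instantiation cheap; they are not the real Eagle checkpoint settings:
```python
from transformers import Qwen2ForCausalLM
from transformers.models.qwen2.configuration_qwen2 import Qwen2Config

# Tiny, arbitrary config so the model builds instantly; for this
# smoke test only.
config = Qwen2Config(
    hidden_size=64,
    intermediate_size=128,
    num_hidden_layers=2,
    num_attention_heads=4,
    num_key_value_heads=4,
    vocab_size=1000,
)
model = Qwen2ForCausalLM(config)
print(model.config.model_type)  # -> "qwen2"
```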
From "modeling_eagle_chat.py" Line 441-449 [We should remove return_dict argument]
outputs = self.language_model.generate(
inputs_embeds=input_embeds,
attention_mask=attention_mask,
generation_config=generation_config,
output_hidden_states=output_hidden_states,
use_cache=True,
**generate_kwargs,
)
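For anyone who still needs the structured output that `return_dict` used to provide, `generate()` accepts `return_dict_in_generate` in the Transformers versions this repo targets. A sketch, assuming the same surrounding variables as the call above:
```python
# Same call, but asking generate() for a structured GenerateOutput
# object instead of a bare tensor of token ids.
outputs = self.language_model.generate(
    inputs_embeds=input_embeds,
    attention_mask=attention_mask,
    generation_config=generation_config,
    output_hidden_states=output_hidden_states,
    return_dict_in_generate=True,
    use_cache=True,
    **generate_kwargs,
)
sequences = outputs.sequences  # generated token ids live under .sequences
```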