Could not find GemmaForCausalLM neither in <module 'transformers.models.gemma'

#44
by chenwei1984 - opened

Error: Could not find GemmaForCausalLM neither in <module 'transformers.models.gemma'

Output of pip show transformers:

Name: transformers
Version: 4.38.1
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: C:\Users\Administrator\AppData\Local\NVIDIA\MiniConda\envs\gcn\Lib\site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm

I have already installed all the specified versions, but it still doesn't work.
Why doesn't it work no matter how I update? Is anyone else experiencing the same issue?

Hi @chenwei1984
Thanks for the issue - there might be some weird conflict in your environment. Can you try re-installing transformers in a fresh environment?
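For example, after the re-install, a quick check along these lines (just a sketch, not from this thread; it assumes transformers >= 4.38, the first release with Gemma support) should print the class without errors:

# Minimal sanity check in the fresh environment
# (a sketch; assumes transformers >= 4.38, where Gemma support was added).
import importlib.metadata

print("transformers:", importlib.metadata.version("transformers"))

# This import raises ImportError if the installed transformers build
# does not expose the Gemma model classes.
from transformers import GemmaForCausalLM
print(GemmaForCausalLM)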

It sure is new.

Google org

Hi @chenwei1984, could you please provide more details on the issue, such as the steps you have tried along with reproducible code to replicate the error, so we can better understand it? Thank you.
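For this kind of import error, the Python, transformers, torch and accelerate versions are usually the most useful details. A small script like the sketch below (the package list is an assumption, adjust it to your setup) is enough to collect them:

# Print the environment details most relevant to this import error.
# (A sketch for collecting debug info; the package list is an assumption,
# adjust it to whatever is installed in your environment.)
import sys
import importlib.metadata

print("Python:", sys.version)
for pkg in ("transformers", "torch", "accelerate", "tokenizers", "safetensors"):
    try:
        print(pkg, importlib.metadata.version(pkg))
    except importlib.metadata.PackageNotFoundError:
        print(pkg, "not installed")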

Google org

I'm having the same issue (ricc@ - go/ricc-AIWC-C1-S1-fl).

input from https://huggingface.co./google/gemma-2-9b:

# pip install accelerate
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained("google/gemma-2-9b")
model = AutoModelForCausalLM.from_pretrained(
    "google/gemma-2-9b",
    device_map="auto",
)

input_text = "Write me a poem about Machine Learning."
input_ids = tokenizer(input_text, return_tensors="pt").to("cuda")

outputs = model.generate(**input_ids, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))

output:

Traceback (most recent call last):
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 701, in getattribute_from_module
    return getattribute_from_module(transformers_module, attr)
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 705, in getattribute_from_module
    raise ValueError(f"Could not find {attr} in {transformers_module}!")
ValueError: Could not find Gemma2ForCausalLM in <module 'transformers' from '/home/ricc/.venv/lib/python3.8/site-packages/transformers/__init__.py'>!

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1603, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 848, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/generation/utils.py", line 119, in <module>
    from accelerate.hooks import AlignDevicesHook, add_hook_to_module
  File "/home/ricc/accelerate.py", line 7, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
    model_class = _get_model_class(config, cls._model_mapping)
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 384, in _get_model_class
    supported_models = model_mapping[type(config)]
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 735, in __getitem__
    return self._load_attr_from_module(model_type, model_name)
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 749, in _load_attr_from_module
    return getattribute_from_module(self._modules[module_name], attr)
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 703, in getattribute_from_module
    raise ValueError(f"Could not find {attr} neither in {module} nor in {transformers_module}!")
ValueError: Could not find Gemma2ForCausalLM neither in <module 'transformers.models.gemma2' from '/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/gemma2/__init__.py'> nor in <module 'transformers' from '/home/ricc/.venv/lib/python3.8/site-packages/transformers/__init__.py'>!

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1603, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 848, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/gemma2/modeling_gemma2.py", line 37, in <module>
    from ...modeling_utils import PreTrainedModel
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/modeling_utils.py", line 46, in <module>
    from .generation import GenerationConfig, GenerationMixin
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1593, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1605, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
Could not find Gemma2ForCausalLM neither in <module 'transformers.models.gemma2' from '/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/gemma2/__init__.py'> nor in <module 'transformers' from '/home/ricc/.venv/lib/python3.8/site-packages/transformers/__init__.py'>!

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "accelerate.py", line 7, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
    model_class = _get_model_class(config, cls._model_mapping)
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 384, in _get_model_class
    supported_models = model_mapping[type(config)]
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 735, in __getitem__
    return self._load_attr_from_module(model_type, model_name)
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 749, in _load_attr_from_module
    return getattribute_from_module(self._modules[module_name], attr)
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 693, in getattribute_from_module
    if hasattr(module, attr):
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1593, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/ricc/.venv/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1605, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.gemma2.modeling_gemma2 because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
Could not find Gemma2ForCausalLM neither in <module 'transformers.models.gemma2' from '/home/ricc/.venv/lib/python3.8/site-packages/transformers/models/gemma2/__init__.py'> nor in <module 'transformers' from '/home/ricc/.venv/lib/python3.8/site-packages/transformers/__init__.py'>!

Google org

@palladius, it seems there is a version incompatibility between the libraries installed on your system and what is needed to run this Gemma model. You are using the older Python version 3.8 to run the Gemma model. Please try running the model again after installing Python 3.10 and let us know if the issue still persists.
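After the upgrade, a quick check like the sketch below (the 3.10 floor simply follows the suggestion above, and the import assumes a transformers release recent enough to ship Gemma 2) can confirm the environment before retrying the full script:

# Sanity check after upgrading Python and reinstalling the stack.
# (A sketch; the 3.10 minimum follows the suggestion above, and the import
# assumes a transformers release recent enough to include Gemma 2.)
import sys
import importlib.metadata

assert sys.version_info >= (3, 10), f"Still on an old interpreter: {sys.version}"
print("transformers:", importlib.metadata.version("transformers"))

# ImportError here means the installed transformers is too old for Gemma 2.
from transformers import Gemma2ForCausalLM
print("Gemma2ForCausalLM is available")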
