unable to use custom inference endpoint
#9 by HonestAnnie
"The checkpoint you are trying to load has model type gemma2 but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date."
Perhaps adding a requirements.txt with a pinned transformers version would fix it, e.g.:
transformers>=4.44.0
Reference: https://huggingface.co./docs/inference-endpoints/guides/custom_dependencies
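For reference, the endpoint installs anything listed in requirements.txt from the repository and, if a handler.py is present, uses it as a custom handler. A minimal sketch of such a handler is below; it assumes transformers>=4.44.0 is installed via requirements.txt so the gemma2 architecture loads, and the pooling/normalization steps are only illustrative, not the model card's exact recipe.

```python
# handler.py — minimal sketch of a custom Inference Endpoints handler.
# Assumes requirements.txt pins transformers>=4.44.0 so gemma2 is recognized.
from typing import Any, Dict, List

import torch
from transformers import AutoModel, AutoTokenizer


class EndpointHandler:
    def __init__(self, path: str = ""):
        # `path` points at the repository snapshot inside the endpoint container.
        self.tokenizer = AutoTokenizer.from_pretrained(path)
        self.model = AutoModel.from_pretrained(path, torch_dtype=torch.float16)
        self.model.eval()

    def __call__(self, data: Dict[str, Any]) -> List[List[float]]:
        inputs = data.get("inputs", "")
        if isinstance(inputs, str):
            inputs = [inputs]
        batch = self.tokenizer(
            inputs, padding=True, truncation=True, return_tensors="pt"
        )
        with torch.no_grad():
            outputs = self.model(**batch)
        # Last-token pooling (assumes right padding), then L2-normalize
        # so the embeddings can be compared with cosine similarity.
        hidden = outputs.last_hidden_state
        last_token_idx = batch["attention_mask"].sum(dim=1) - 1
        embeddings = hidden[torch.arange(hidden.size(0)), last_token_idx]
        embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)
        return embeddings.tolist()
```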
I've created a PR: https://huggingface.co./BAAI/bge-multilingual-gemma2/discussions/11