Can vLLM launch this model?

#2 by chopin1998 - opened

Currently, it says:

```
ERROR 12-16 09:49:55 engine.py:366] raise ValueError(f"No supported config format found in {model}")
ERROR 12-16 09:49:55 engine.py:366] ValueError: No supported config format found in unsloth/Llama-3.3-70B-Instruct-GGUF
```

vllm version is 0.6.4.post1
transformers 4.47.0
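
For context, a minimal reproduction sketch (an assumption: the exact command from the report is not shown, and the same error path is hit via `vllm serve`). Pointing vLLM at the GGUF repo id fails because that repo contains no `config.json` for vLLM's config loader to parse:

```python
from vllm import LLM

# Assumed invocation; the GGUF repo ships only .gguf files, so vLLM's
# config loader raises:
#   ValueError: No supported config format found in unsloth/Llama-3.3-70B-Instruct-GGUF
llm = LLM(model="unsloth/Llama-3.3-70B-Instruct-GGUF")
```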

Unsloth AI org


You need a config.json file. Copy the original 16-bit config.json file from GitHub and it should work. I wouldn't recommend using vLLM for GGUF, though; use llama.cpp instead.
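
For anyone trying the vLLM route anyway, here is a minimal sketch of how vLLM's (experimental, single-file) GGUF support is typically used: download one local .gguf file and point `tokenizer` at the original 16-bit repo, which also supplies the config. The GGUF filename and the `unsloth/Llama-3.3-70B-Instruct` tokenizer repo id below are assumptions; check the actual file list on the Hub:

```python
from huggingface_hub import hf_hub_download
from vllm import LLM

# Download one GGUF file from the quantized repo.
# The filename is hypothetical; use the real one from the repo's file list.
gguf_path = hf_hub_download(
    repo_id="unsloth/Llama-3.3-70B-Instruct-GGUF",
    filename="Llama-3.3-70B-Instruct-Q4_K_M.gguf",
)

# vLLM's GGUF loader takes the local file path as the model and the
# original (16-bit) repo for the tokenizer/config.
llm = LLM(
    model=gguf_path,
    tokenizer="unsloth/Llama-3.3-70B-Instruct",  # assumed original repo id
)
print(llm.generate("Hello, my name is")[0].outputs[0].text)
```

And since the recommendation is llama.cpp, the same file loads natively there with no config.json at all. A sketch using the llama-cpp-python bindings:

```python
from llama_cpp import Llama

# llama.cpp reads GGUF metadata directly; no config.json is needed.
llm = Llama(model_path=gguf_path, n_gpu_layers=-1, n_ctx=4096)
print(llm("Hello, my name is", max_tokens=32)["choices"][0]["text"])
```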

Hi, where exactly is the link to this file?
Thanks.
