Text Generation · Transformers · English · llama · Inference Endpoints

Something wrong with bin files?

#18
by mancub - opened

I tried loading the bins just for testing (I use safetensors and GPU otherwise), but the models won't load and produce an error in oobabooga/text-generation-webui:

Traceback (most recent call last):
  File "/home/user/Envs/text-generation-webui_env/text-generation-webui/server.py", line 914, in <module>
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "/home/user/Envs/text-generation-webui_env/text-generation-webui/modules/models.py", line 71, in load_model
    shared.model_type = find_model_type(model_name)
  File "/home/user/Envs/text-generation-webui_env/text-generation-webui/modules/models.py", line 59, in find_model_type
    config = AutoConfig.from_pretrained(Path(f'{shared.args.model_dir}/{model_name}'))
  File "/home/user/Envs/text-generation-webui_env/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 916, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/user/Envs/text-generation-webui_env/lib/python3.10/site-packages/transformers/configuration_utils.py", line 573, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/user/Envs/text-generation-webui_env/lib/python3.10/site-packages/transformers/configuration_utils.py", line 661, in _get_config_dict
    raise EnvironmentError(
OSError: It looks like the config file at 'models/vicuna-13b-free-V4.3-q4_0.bin' is not a valid JSON file.
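For context on the error itself: a ggml .bin file starts with a binary magic header, not JSON text, so when the filename-based dispatch falls through to AutoConfig (which tries to read the path as a JSON config), parsing fails exactly as in the traceback. A minimal sketch of that failure mode, with a fake ggml header instead of a real model file:

```python
import json
import os
import struct
import tempfile

# A ggml file begins with a binary magic number, not a JSON document.
GGML_MAGIC = 0x67676D6C  # the bytes "ggml" in ASCII

with tempfile.NamedTemporaryFile(suffix=".bin", delete=False) as f:
    f.write(struct.pack("<I", GGML_MAGIC))  # fake ggml header only
    path = f.name

try:
    with open(path) as f:
        json.load(f)  # roughly what AutoConfig attempts on the .bin
except json.JSONDecodeError as e:
    print("not a valid JSON file:", type(e).__name__)

os.remove(path)
```

This is only an illustration of why the OSError mentions JSON; the real fix is making the webui recognize the file as a llama.cpp model in the first place.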

An older ggml vicuna model I got elsewhere loads without issues.

All of my code and modules are up to date.

With ooba, I think you need to have "ggml-" in front of the bin filename (according to https://github.com/oobabooga/text-generation-webui/blob/main/docs/llama.cpp-models.md). So you could rename the bin files to ggml-vicuna-13b-free-V4.3-q4_0.bin, etc. Could that solve the issue?
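The reason the prefix matters: the webui picks a loader by inspecting the filename, and only names containing "ggml" get routed to llama.cpp; everything else falls through to transformers' AutoConfig, which then chokes on the binary file. A hypothetical sketch of that dispatch (function name and return values are illustrative, not the webui's actual code):

```python
def guess_loader(model_name: str) -> str:
    """Hypothetical filename-based dispatch, as in find_model_type:
    names containing "ggml" go to the llama.cpp loader; anything else
    falls back to transformers, which requires a config.json."""
    if "ggml" in model_name.lower():
        return "llama.cpp"
    return "transformers"

print(guess_loader("ggml-vicuna-13b-free-V4.3-q4_0.bin"))  # llama.cpp
print(guess_loader("vicuna-13b-free-V4.3-q4_0.bin"))       # transformers
```

So renaming the file so "ggml" appears in its name is enough to flip it onto the llama.cpp path.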

That was it, thanks!

I'll try to RTFM more closely in the future :)

mancub changed discussion status to closed
