Runtime error

Exit code: 1. Reason:
Loading model...
/usr/local/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/app/app.py", line 41, in <module>
    model, tokenizer = load_model()
  File "/home/user/app/app.py", line 16, in load_model
    tokenizer = AutoTokenizer.from_pretrained(model_name)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 786, in from_pretrained
    return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2008, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'TheBloke/Llama-2-13B-chat-GGUF'. If you were trying to load it from 'https://huggingface.co./models', make sure you don't have a local directory with the same name. Otherwise, make sure 'TheBloke/Llama-2-13B-chat-GGUF' is the correct path to a directory containing all relevant files for a LlamaTokenizerFast tokenizer.
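For context, the failing call can be reproduced on its own: 'TheBloke/Llama-2-13B-chat-GGUF' is a GGUF weights repo and does not contain the tokenizer files that transformers' AutoTokenizer looks for, so from_pretrained raises the OSError above. The sketch below is only an illustration, not the app's actual code; the commented-out workaround (taking the tokenizer from the source repo 'meta-llama/Llama-2-13b-chat-hf' and running the GGUF file with the ctransformers backend, with a hypothetical model_file choice) is an assumption about how such an app could be wired, not a confirmed fix.

from transformers import AutoTokenizer

# Reproduces the OSError from the traceback: the GGUF repo ships only
# quantized weight files, so there is no tokenizer for transformers to load.
try:
    AutoTokenizer.from_pretrained("TheBloke/Llama-2-13B-chat-GGUF")
except OSError as err:
    print(f"Tokenizer load failed as expected: {err}")

# Possible workaround (assumption, not from the original app.py):
# load the tokenizer from the original fp16 repo (gated, needs HF access)
# and run the GGUF weights with a GGUF-capable backend such as ctransformers.
# tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-13b-chat-hf")
# from ctransformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "TheBloke/Llama-2-13B-chat-GGUF",
#     model_file="llama-2-13b-chat.Q4_K_M.gguf",  # hypothetical file choice
#     model_type="llama",
# )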
