runtime error
Space failed. Exit code: 1. Reason: n(*args, **kwargs)
  File "/home/user/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1631, in get_hf_file_metadata
    r = _request_wrapper(
  File "/home/user/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
    response = _request_wrapper(
  File "/home/user/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
    hf_raise_for_status(response)
  File "/home/user/.local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 302, in hf_raise_for_status
    raise GatedRepoError(message, response) from e
huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6647c23c-1548ce2e3b60ec00168d29d3;3eef6069-9ccd-4f5c-8e47-cb219ed3e144)
Cannot access gated repo for url https://huggingface.co./mistralai/Mistral-7B-v0.1/resolve/main/tokenizer_config.json.
Access to model mistralai/Mistral-7B-v0.1 is restricted. You must be authenticated to access it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 5, in <module>
    tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")
  File "/home/user/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 737, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 569, in get_tokenizer_config
    resolved_config_file = cached_file(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/utils/hub.py", line 404, in cached_file
    raise EnvironmentError(
OSError: You are trying to access a gated repo. Make sure to request access at https://huggingface.co./mistralai/Mistral-7B-v0.1 and pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`.
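The final OSError spells out the fix: the Space must request access to mistralai/Mistral-7B-v0.1 on the Hub and then authenticate its downloads. A minimal sketch of how app.py line 5 could pass a token, assuming the token is stored in a Space secret named HF_TOKEN (that secret name is an assumption, not something shown in the log):

```python
import os

from transformers import AutoTokenizer

# Assumption: an access token with permission for mistralai/Mistral-7B-v0.1
# was saved as a Space secret named HF_TOKEN after access was granted.
hf_token = os.environ.get("HF_TOKEN")

# Passing token= lets huggingface_hub authenticate the download of
# tokenizer_config.json instead of raising GatedRepoError (401).
tokenizer = AutoTokenizer.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    token=hf_token,
)
```

The log's other suggestion, `huggingface-cli login`, achieves the same thing by caching a token locally, which suits interactive environments rather than a Space.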