runtime error

Exit code: 1. Reason:

(download progress output elided: processor_config.json, chat_template.json, preprocessor_config.json, tokenizer_config.json, tokenizer.json, special_tokens_map.json, config.json, and model.safetensors.index.json were all fetched successfully)

`rope_scaling`'s original_max_position_embeddings field must be less than max_position_embeddings, got 8192 and max_position_embeddings=2048

Traceback (most recent call last):
  File "/home/user/app/app.py", line 11, in <module>
    model = LlavaForConditionalGeneration.from_pretrained(model_id).to("cuda")
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3990, in from_pretrained
    resolved_archive_file, sharded_metadata = get_checkpoint_shard_files(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 1077, in get_checkpoint_shard_files
    shard_filenames = sorted(set(index["weight_map"].values()))
KeyError: 'weight_map'
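The KeyError means the downloaded model.safetensors.index.json lacks the "weight_map" key that transformers expects in a sharded checkpoint index. A minimal pre-flight check could catch this before calling from_pretrained; this is a sketch, and has_weight_map is a hypothetical helper name, not part of any library:

```python
import json

def has_weight_map(index_path):
    # A valid sharded-checkpoint index maps each tensor name to the
    # shard file that contains it under the "weight_map" key; if that
    # key is missing, transformers raises KeyError: 'weight_map'.
    with open(index_path) as f:
        index = json.load(f)
    return "weight_map" in index
```

If the check returns False, the model repository's index file is malformed or incomplete, and the fix lies in the checkpoint itself rather than in the loading code.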
