runtime error

Exit code: 1. Reason:

The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
0it [00:00, ?it/s]
0it [00:00, ?it/s]

/usr/local/lib/python3.10/site-packages/diffusers/models/transformers/transformer_2d.py:34: FutureWarning: `Transformer2DModelOutput` is deprecated and will be removed in version 1.0.0. Importing `Transformer2DModelOutput` from `diffusers.models.transformer_2d` is deprecated and this will be removed in a future version. Please use `from diffusers.models.modeling_outputs import Transformer2DModelOutput`, instead.
  deprecate("Transformer2DModelOutput", "1.0.0", deprecation_message)

Traceback (most recent call last):
  File "/home/user/app/app.py", line 187, in <module>
    vae = load_vae(vae_dir)
  File "/home/user/app/app.py", line 57, in load_vae
    vae = CausalVideoAutoencoder.from_config(vae_config)
  File "/home/user/app/xora/models/autoencoders/causal_video_autoencoder.py", line 65, in from_config
    config["_class_name"] == "CausalVideoAutoencoder"
AssertionError: config must have _class_name=CausalVideoAutoencoder
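
The traceback shows that `load_vae` in app.py passes a `vae_config` whose `_class_name` is not `"CausalVideoAutoencoder"`, so the assertion at line 65 of causal_video_autoencoder.py fails. This usually means `vae_dir` points at the wrong checkpoint directory, or at a config written for a different class. A minimal pre-flight check is sketched below; it assumes the VAE directory contains a `config.json`, and the `load_vae_config` helper is hypothetical, for illustration only, not taken from the app.

```python
import json
from pathlib import Path

def load_vae_config(vae_dir: str) -> dict:
    """Load the VAE config and verify it targets CausalVideoAutoencoder.

    Hypothetical helper: the `config.json` file name and directory layout
    are assumptions, not confirmed by the log above.
    """
    config_path = Path(vae_dir) / "config.json"
    with open(config_path) as f:
        vae_config = json.load(f)

    # from_config asserts on this exact value; fail early with a clearer
    # message instead of hitting the bare AssertionError in the traceback.
    class_name = vae_config.get("_class_name")
    if class_name != "CausalVideoAutoencoder":
        raise ValueError(
            f"{config_path} declares _class_name={class_name!r}; expected "
            "'CausalVideoAutoencoder'. Check that vae_dir points at the VAE "
            "checkpoint directory, not the full pipeline directory."
        )
    return vae_config
```

If the checkpoint itself is correct and only the field is missing, setting `vae_config["_class_name"] = "CausalVideoAutoencoder"` before calling `CausalVideoAutoencoder.from_config(vae_config)` would also get past the assertion, but verifying the directory first avoids loading mismatched weights.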
