MAE-CT-M1N0-M12_v8_split2

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5597
  • Accuracy: 0.7260
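For quick reference, here is one way to run inference with this checkpoint. This is a minimal sketch, not an official usage recipe: it assumes the repository ships a preprocessor config, feeds a dummy 16-frame clip (replace it with real decoded video frames), and uses the standard transformers video-classification API.

```python
# Minimal inference sketch (assumptions: 16-frame clips and a preprocessor
# config in the repo; the dummy clip below is a placeholder).
import numpy as np
import torch
from transformers import VideoMAEForVideoClassification, VideoMAEImageProcessor

ckpt = "beingbatman/MAE-CT-M1N0-M12_v8_split2"
processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)
model.eval()

# Placeholder clip: 16 random RGB frames. Swap in frames decoded from a real video.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# id2label may be generic (LABEL_0, ...) if label names were not set at training time.
pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```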

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 6200
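As a hedged reconstruction, the listed values map onto transformers TrainingArguments roughly as follows; output_dir is a placeholder, and any evaluation or logging settings beyond the values above are not documented in this card.

```python
# Rough TrainingArguments equivalent of the listed hyperparameters.
# output_dir is a placeholder; eval/save cadence is not documented above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="MAE-CT-M1N0-M12_v8_split2",  # placeholder path, an assumption
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=6200,
)
```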

Training results

Training Loss | Epoch   | Step | Validation Loss | Accuracy
------------- | ------- | ---- | --------------- | --------
0.6375 | 0.0102 | 63 | 0.6993 | 0.5161
0.7565 | 1.0102 | 126 | 0.7252 | 0.5161
0.6926 | 2.0102 | 189 | 0.7296 | 0.5161
0.5636 | 3.0102 | 252 | 0.7618 | 0.5161
0.4721 | 4.0102 | 315 | 1.2407 | 0.5161
0.7569 | 5.0102 | 378 | 0.8010 | 0.5161
0.384 | 6.0102 | 441 | 0.6036 | 0.6774
0.6542 | 7.0102 | 504 | 0.5455 | 0.8065
0.3615 | 8.0102 | 567 | 1.3506 | 0.5484
0.2246 | 9.0102 | 630 | 1.5499 | 0.5806
0.7929 | 10.0102 | 693 | 1.0719 | 0.6452
0.5963 | 11.0102 | 756 | 0.9215 | 0.6129
0.1342 | 12.0102 | 819 | 0.9188 | 0.6452
0.2511 | 13.0102 | 882 | 1.4410 | 0.6452
0.5877 | 14.0102 | 945 | 2.3550 | 0.5161
0.3261 | 15.0102 | 1008 | 1.0729 | 0.6774
0.0425 | 16.0102 | 1071 | 2.5330 | 0.5806
0.174 | 17.0102 | 1134 | 2.8190 | 0.5806
0.1972 | 18.0102 | 1197 | 2.4491 | 0.5484
0.2264 | 19.0102 | 1260 | 1.9345 | 0.6774
0.0862 | 20.0102 | 1323 | 3.3695 | 0.5161
0.0998 | 21.0102 | 1386 | 1.4091 | 0.7742
0.311 | 22.0102 | 1449 | 2.7629 | 0.5484
0.0481 | 23.0102 | 1512 | 2.0506 | 0.6452
0.2109 | 24.0102 | 1575 | 2.5990 | 0.5806
0.179 | 25.0102 | 1638 | 2.7815 | 0.5806
0.0002 | 26.0102 | 1701 | 3.6719 | 0.5161
0.0996 | 27.0102 | 1764 | 3.7618 | 0.5161
0.0002 | 28.0102 | 1827 | 3.3375 | 0.5484
0.0004 | 29.0102 | 1890 | 2.8750 | 0.6129
0.0001 | 30.0102 | 1953 | 2.5867 | 0.6774
0.1188 | 31.0102 | 2016 | 1.8263 | 0.6774
0.0295 | 32.0102 | 2079 | 3.2699 | 0.5806
0.1931 | 33.0102 | 2142 | 3.3532 | 0.5806
0.0002 | 34.0102 | 2205 | 4.2001 | 0.5161
0.0001 | 35.0102 | 2268 | 3.3819 | 0.5484
0.0001 | 36.0102 | 2331 | 2.2776 | 0.7097
0.0007 | 37.0102 | 2394 | 2.8516 | 0.5806
0.0001 | 38.0102 | 2457 | 4.0420 | 0.5161
0.0002 | 39.0102 | 2520 | 2.5901 | 0.6452
0.0001 | 40.0102 | 2583 | 3.5043 | 0.5806
0.0001 | 41.0102 | 2646 | 3.5424 | 0.5806
0.0001 | 42.0102 | 2709 | 3.8740 | 0.5484
0.0001 | 43.0102 | 2772 | 3.5726 | 0.5806
0.0004 | 44.0102 | 2835 | 3.2184 | 0.5806
0.0 | 45.0102 | 2898 | 3.3347 | 0.5806
0.0001 | 46.0102 | 2961 | 3.8206 | 0.5806
0.0 | 47.0102 | 3024 | 3.7951 | 0.5484
0.0 | 48.0102 | 3087 | 2.7604 | 0.6774
0.0 | 49.0102 | 3150 | 4.3949 | 0.5484
0.0 | 50.0102 | 3213 | 2.8947 | 0.6774
0.0 | 51.0102 | 3276 | 4.2413 | 0.5161
0.1268 | 52.0102 | 3339 | 2.3339 | 0.7097
0.0 | 53.0102 | 3402 | 3.4769 | 0.6129
0.0 | 54.0102 | 3465 | 3.5142 | 0.5806
0.0 | 55.0102 | 3528 | 3.5718 | 0.5161
0.0036 | 56.0102 | 3591 | 4.1867 | 0.4839
0.0026 | 57.0102 | 3654 | 2.7411 | 0.6452
0.0 | 58.0102 | 3717 | 4.0464 | 0.5484
0.0001 | 59.0102 | 3780 | 3.6255 | 0.5806
0.0 | 60.0102 | 3843 | 4.7292 | 0.5161
0.1406 | 61.0102 | 3906 | 3.9876 | 0.5806
0.0 | 62.0102 | 3969 | 3.4099 | 0.6129
0.0 | 63.0102 | 4032 | 3.2674 | 0.5806
0.0 | 64.0102 | 4095 | 3.9749 | 0.5806
0.0 | 65.0102 | 4158 | 3.3262 | 0.6129
0.0 | 66.0102 | 4221 | 2.5556 | 0.7097
0.2639 | 67.0102 | 4284 | 3.6954 | 0.6129
0.0011 | 68.0102 | 4347 | 3.2776 | 0.5806
0.0 | 69.0102 | 4410 | 3.6620 | 0.5806
0.0 | 70.0102 | 4473 | 3.5887 | 0.5806
0.0 | 71.0102 | 4536 | 4.5040 | 0.5484
0.0 | 72.0102 | 4599 | 3.8666 | 0.5484
0.0 | 73.0102 | 4662 | 4.0017 | 0.5484
0.0 | 74.0102 | 4725 | 3.9422 | 0.5484
0.0001 | 75.0102 | 4788 | 4.5397 | 0.5484
0.0 | 76.0102 | 4851 | 3.8405 | 0.5806
0.0 | 77.0102 | 4914 | 3.9992 | 0.5806
0.0 | 78.0102 | 4977 | 3.9722 | 0.5806
0.0 | 79.0102 | 5040 | 3.9421 | 0.5806
0.2333 | 80.0102 | 5103 | 4.0817 | 0.5484
0.0 | 81.0102 | 5166 | 3.6669 | 0.6129
0.0 | 82.0102 | 5229 | 3.6607 | 0.6129
0.0 | 83.0102 | 5292 | 3.6873 | 0.6129
0.0 | 84.0102 | 5355 | 4.5979 | 0.5484
0.0 | 85.0102 | 5418 | 3.8881 | 0.5806
0.0 | 86.0102 | 5481 | 4.6014 | 0.5484
0.0 | 87.0102 | 5544 | 3.7988 | 0.6129
0.0 | 88.0102 | 5607 | 3.8047 | 0.6129
0.0 | 89.0102 | 5670 | 3.8106 | 0.6129
0.0001 | 90.0102 | 5733 | 4.1468 | 0.5806
0.0 | 91.0102 | 5796 | 4.3700 | 0.5484
0.0 | 92.0102 | 5859 | 4.2392 | 0.5484
0.0 | 93.0102 | 5922 | 4.2085 | 0.5484
0.0 | 94.0102 | 5985 | 4.2017 | 0.5806
0.0 | 95.0102 | 6048 | 4.1972 | 0.5806
0.0 | 96.0102 | 6111 | 4.1925 | 0.5806
0.0 | 97.0102 | 6174 | 4.1911 | 0.5806
0.0 | 98.0042 | 6200 | 4.1910 | 0.5806

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0
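
Exact version pinning is usually unnecessary for inference, but if you want to confirm that your environment matches the training setup, a small check like the sketch below (an illustration, not part of this repository) will report any drift from the versions listed above.

```python
# Report any drift from the framework versions listed in this card.
import datasets, tokenizers, torch, transformers

expected = {
    "transformers": "4.46.2",
    "torch": "2.0.1+cu117",
    "datasets": "3.0.1",
    "tokenizers": "0.20.0",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    have = installed[name]
    note = "OK" if have == want else f"differs (card lists {want})"
    print(f"{name}: {have} ({note})")
```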