
invitrace-ilivewell-freeze-layer11

This model is a fine-tuned version of google/vit-base-patch16-224-in21k, trained on an image-classification dataset loaded with the Hugging Face imagefolder loader. It achieves the following results on the evaluation set:

  • Loss: 1.3049
  • Accuracy: 0.7194
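
For a quick check of these numbers, the checkpoint can be loaded like any other ViT image-classification model. The snippet below is a minimal sketch, assuming the model is published on the Hub under the repo id Invitrace/I-live-well-foodai-freeze-layer11 (the id listed for this card) and using a placeholder image path.

```python
from transformers import pipeline

# Assumed Hub repo id for this checkpoint; adjust if the model lives elsewhere.
MODEL_ID = "Invitrace/I-live-well-foodai-freeze-layer11"

# The image-classification pipeline wires up the ViT image processor and
# the fine-tuned classification head for us.
classifier = pipeline("image-classification", model=MODEL_ID)

# "example.jpg" is a placeholder path to any local image.
for pred in classifier("example.jpg", top_k=5):
    print(f"{pred['label']}: {pred['score']:.3f}")
```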

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8
  • mixed_precision_training: Native AMP
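
These settings map directly onto TrainingArguments in the Transformers Trainer API. The sketch below is a rough reconstruction rather than the original training script: the number of labels is a placeholder, and the layer-freezing step is an assumption inferred only from the repository name (freeze-layer11).

```python
from transformers import ViTForImageClassification, TrainingArguments

# Base checkpoint named in this card; the real label count is not stated,
# so num_labels here is a placeholder.
model = ViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=200,  # placeholder
)

# Assumption from the repo name: train only the last encoder block (layer 11),
# the final layernorm, and the classification head; freeze everything else.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith(
        ("vit.encoder.layer.11", "vit.layernorm", "classifier")
    )

# Mirrors the hyperparameter list above. The Adam settings are the
# TrainingArguments defaults (betas=(0.9, 0.999), epsilon=1e-08), and
# fp16=True corresponds to native AMP mixed precision.
args = TrainingArguments(
    output_dir="invitrace-ilivewell-freeze-layer11",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=8,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,
    eval_strategy="steps",  # the log below evaluates every 200 steps
    eval_steps=200,
)

# A Trainer would then be built with the (unpublished) imagefolder splits:
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_split, eval_dataset=val_split)
# trainer.train()
```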

Training results

Training Loss Epoch Step Validation Loss Accuracy
5.0728 0.0803 200 5.0558 0.0417
4.9729 0.1605 400 4.9402 0.0865
4.8315 0.2408 600 4.8201 0.1738
4.6814 0.3210 800 4.6929 0.1999
4.5936 0.4013 1000 4.5779 0.2296
4.5513 0.4815 1200 4.4604 0.2645
4.3161 0.5618 1400 4.3466 0.2976
4.2724 0.6421 1600 4.2445 0.3472
4.1105 0.7223 1800 4.1402 0.3530
4.076 0.8026 2000 4.0522 0.3703
3.9963 0.8828 2200 3.9616 0.3803
3.9235 0.9631 2400 3.8737 0.4122
3.5191 1.0433 2600 3.7816 0.4142
3.6156 1.1236 2800 3.6908 0.4345
3.4338 1.2039 3000 3.6081 0.4487
3.4952 1.2841 3200 3.5293 0.4792
3.369 1.3644 3400 3.4345 0.4836
3.3625 1.4446 3600 3.3572 0.5065
3.3323 1.5249 3800 3.2832 0.5160
3.2143 1.6051 4000 3.2152 0.5035
3.0538 1.6854 4200 3.1389 0.5212
3.0841 1.7657 4400 3.0654 0.5418
2.9804 1.8459 4600 2.9880 0.5493
3.0014 1.9262 4800 2.9356 0.5449
3.0302 2.0064 5000 2.8573 0.5585
2.714 2.0867 5200 2.7933 0.5667
2.6886 2.1669 5400 2.7398 0.5649
2.4439 2.2472 5600 2.6725 0.5802
2.3172 2.3274 5800 2.6106 0.5836
2.3907 2.4077 6000 2.5523 0.5896
2.617 2.4880 6200 2.4995 0.5942
2.1277 2.5682 6400 2.4401 0.6041
2.2384 2.6485 6600 2.3903 0.6113
2.2872 2.7287 6800 2.3348 0.6163
2.1654 2.8090 7000 2.3031 0.6083
2.1407 2.8892 7200 2.2365 0.6239
2.0487 2.9695 7400 2.2016 0.6179
1.9755 3.0498 7600 2.1587 0.6281
1.7432 3.1300 7800 2.1125 0.6382
1.7321 3.2103 8000 2.0822 0.6438
1.4812 3.2905 8200 2.0482 0.6390
1.637 3.3708 8400 1.9964 0.6556
1.538 3.4510 8600 1.9785 0.6472
1.9383 3.5313 8800 1.9220 0.6602
1.5674 3.6116 9000 1.9167 0.6558
1.6303 3.6918 9200 1.8803 0.6629
1.3882 3.7721 9400 1.8531 0.6619
1.5955 3.8523 9600 1.8092 0.6739
1.5168 3.9326 9800 1.7814 0.6763
1.5228 4.0128 10000 1.7617 0.6697
1.3918 4.0931 10200 1.7434 0.6771
1.3898 4.1734 10400 1.6937 0.6769
1.3597 4.2536 10600 1.6928 0.6801
1.3249 4.3339 10800 1.6636 0.6807
1.361 4.4141 11000 1.6585 0.6829
1.2845 4.4944 11200 1.6296 0.6887
1.2342 4.5746 11400 1.6049 0.6928
1.1281 4.6549 11600 1.5856 0.6948
1.2667 4.7352 11800 1.5775 0.6905
1.3742 4.8154 12000 1.5698 0.6911
1.076 4.8957 12200 1.5423 0.6942
1.2422 4.9759 12400 1.5282 0.6970
0.9078 5.0562 12600 1.5109 0.6992
1.0157 5.1364 12800 1.4908 0.7010
1.1909 5.2167 13000 1.4917 0.7006
1.0085 5.2970 13200 1.4804 0.6996
1.0942 5.3772 13400 1.4662 0.7024
0.9015 5.4575 13600 1.4785 0.6952
0.899 5.5377 13800 1.4409 0.7076
1.1695 5.6180 14000 1.4347 0.7074
0.9743 5.6982 14200 1.4381 0.7084
0.9005 5.7785 14400 1.4145 0.7086
1.0092 5.8587 14600 1.4067 0.7128
0.9859 5.9390 14800 1.3790 0.7186
0.8728 6.0193 15000 1.3951 0.7138
0.8551 6.0995 15200 1.3765 0.7211
0.8369 6.1798 15400 1.3751 0.7154
0.8989 6.2600 15600 1.3641 0.7168
0.7289 6.3403 15800 1.3701 0.7162
0.7181 6.4205 16000 1.3661 0.7088
0.7517 6.5008 16200 1.3528 0.7136
1.0271 6.5811 16400 1.3405 0.7200
0.8599 6.6613 16600 1.3296 0.7215
1.0141 6.7416 16800 1.3379 0.7190
0.6966 6.8218 17000 1.3294 0.7194
0.9327 6.9021 17200 1.3241 0.7198
0.8072 6.9823 17400 1.3226 0.7196
0.9195 7.0626 17600 1.3234 0.7170
0.6585 7.1429 17800 1.3171 0.7207
0.9513 7.2231 18000 1.3064 0.7190
0.7139 7.3034 18200 1.3156 0.7215
0.7199 7.3836 18400 1.3098 0.7249
0.7799 7.4639 18600 1.3210 0.7166
0.7034 7.5441 18800 1.3015 0.7245
0.8172 7.6244 19000 1.2978 0.7289
0.6842 7.7047 19200 1.3084 0.7184
0.8592 7.7849 19400 1.2991 0.7231
0.7255 7.8652 19600 1.2929 0.7257
0.8207 7.9454 19800 1.3049 0.7194
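
The Accuracy column above is top-1 accuracy on the evaluation split. The original evaluation code is not published; a typical Trainer setup computes it with the evaluate library along these lines:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # The Trainer passes a (logits, labels) pair at each evaluation step.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)

# Hooked into training via Trainer(..., compute_metrics=compute_metrics).
```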

Framework versions

  • Transformers 4.41.2
  • PyTorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
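
Matching these versions helps when reproducing the run, since Trainer defaults and argument names can shift between releases. A quick environment check, assuming the four packages are installed:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions used for this fine-tune, as listed above.
expected = {
    "transformers": "4.41.2",
    "torch": "2.2.1+cu121",
    "datasets": "2.19.1",
    "tokenizers": "0.19.1",
}

installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}

for name, want in expected.items():
    have = installed[name]
    status = "ok" if have == want else f"card used {want}"
    print(f"{name}: {have} ({status})")
```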