hushem_1x_deit_base_sgd_001_fold3

This model is a fine-tuned version of facebook/deit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3334
  • Accuracy: 0.4186

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
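The warmup ratio and linear scheduler above can be sketched as plain arithmetic. This is a minimal sketch, assuming the 300 total optimizer steps shown in the results table below and the usual behavior of Hugging Face's linear-with-warmup schedule; the function name is illustrative, not part of the training code:

```python
def linear_schedule_lr(step, base_lr=0.001, total_steps=300, warmup_ratio=0.1):
    """Approximate learning rate at a given optimizer step under a
    linear schedule with linear warmup (illustrative sketch)."""
    warmup_steps = int(total_steps * warmup_ratio)  # 0.1 * 300 = 30 steps
    if step < warmup_steps:
        # Warmup: LR climbs linearly from 0 to base_lr.
        return base_lr * step / warmup_steps
    # Decay: LR falls linearly from base_lr back to 0.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_schedule_lr(0))    # 0.0 (start of warmup)
print(linear_schedule_lr(30))   # 0.001 (peak, end of warmup)
print(linear_schedule_lr(300))  # 0.0 (end of training)
```

So with `warmup_ratio: 0.1`, the peak learning rate of 0.001 is reached after the first 30 of 300 steps and then decays linearly to zero.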

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 6 | 1.3830 | 0.3721 |
| 1.4361 | 2.0 | 12 | 1.3798 | 0.3256 |
| 1.4361 | 3.0 | 18 | 1.3766 | 0.3256 |
| 1.4161 | 4.0 | 24 | 1.3738 | 0.3023 |
| 1.4308 | 5.0 | 30 | 1.3713 | 0.3256 |
| 1.4308 | 6.0 | 36 | 1.3692 | 0.3488 |
| 1.4008 | 7.0 | 42 | 1.3671 | 0.3721 |
| 1.4008 | 8.0 | 48 | 1.3650 | 0.3721 |
| 1.3927 | 9.0 | 54 | 1.3634 | 0.3721 |
| 1.37 | 10.0 | 60 | 1.3616 | 0.3721 |
| 1.37 | 11.0 | 66 | 1.3601 | 0.3953 |
| 1.3698 | 12.0 | 72 | 1.3586 | 0.3953 |
| 1.3698 | 13.0 | 78 | 1.3569 | 0.4419 |
| 1.3588 | 14.0 | 84 | 1.3553 | 0.4186 |
| 1.3582 | 15.0 | 90 | 1.3538 | 0.4186 |
| 1.3582 | 16.0 | 96 | 1.3522 | 0.4186 |
| 1.3505 | 17.0 | 102 | 1.3510 | 0.4186 |
| 1.3505 | 18.0 | 108 | 1.3496 | 0.4186 |
| 1.3337 | 19.0 | 114 | 1.3483 | 0.4186 |
| 1.3364 | 20.0 | 120 | 1.3470 | 0.4186 |
| 1.3364 | 21.0 | 126 | 1.3459 | 0.4186 |
| 1.3337 | 22.0 | 132 | 1.3449 | 0.4186 |
| 1.3337 | 23.0 | 138 | 1.3437 | 0.4186 |
| 1.3242 | 24.0 | 144 | 1.3426 | 0.3953 |
| 1.3247 | 25.0 | 150 | 1.3416 | 0.3953 |
| 1.3247 | 26.0 | 156 | 1.3407 | 0.3953 |
| 1.314 | 27.0 | 162 | 1.3398 | 0.3953 |
| 1.314 | 28.0 | 168 | 1.3390 | 0.3953 |
| 1.3132 | 29.0 | 174 | 1.3382 | 0.3953 |
| 1.3139 | 30.0 | 180 | 1.3375 | 0.3953 |
| 1.3139 | 31.0 | 186 | 1.3369 | 0.4186 |
| 1.3104 | 32.0 | 192 | 1.3363 | 0.3953 |
| 1.3104 | 33.0 | 198 | 1.3357 | 0.3953 |
| 1.3084 | 34.0 | 204 | 1.3352 | 0.3953 |
| 1.3046 | 35.0 | 210 | 1.3348 | 0.4186 |
| 1.3046 | 36.0 | 216 | 1.3345 | 0.4186 |
| 1.3016 | 37.0 | 222 | 1.3341 | 0.4186 |
| 1.3016 | 38.0 | 228 | 1.3339 | 0.4186 |
| 1.3084 | 39.0 | 234 | 1.3337 | 0.4186 |
| 1.3045 | 40.0 | 240 | 1.3335 | 0.4186 |
| 1.3045 | 41.0 | 246 | 1.3335 | 0.4186 |
| 1.2931 | 42.0 | 252 | 1.3334 | 0.4186 |
| 1.2931 | 43.0 | 258 | 1.3334 | 0.4186 |
| 1.3105 | 44.0 | 264 | 1.3334 | 0.4186 |
| 1.2967 | 45.0 | 270 | 1.3334 | 0.4186 |
| 1.2967 | 46.0 | 276 | 1.3334 | 0.4186 |
| 1.3024 | 47.0 | 282 | 1.3334 | 0.4186 |
| 1.3024 | 48.0 | 288 | 1.3334 | 0.4186 |
| 1.3038 | 49.0 | 294 | 1.3334 | 0.4186 |
| 1.3026 | 50.0 | 300 | 1.3334 | 0.4186 |
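As a sanity check on the table: 50 epochs correspond to 300 logged optimizer steps, i.e. 6 steps per epoch, which together with the batch size bounds the size of the training split. A back-of-the-envelope sketch, assuming the last (possibly partial) batch is not dropped:

```python
# 50 epochs map to 300 logged optimizer steps in the table above.
steps_per_epoch = 300 // 50
batch_size = 32  # train_batch_size from the hyperparameters

# If the last batch may be partial, 6 steps per epoch implies the
# training split holds between 5*32 + 1 and 6*32 samples.
low = (steps_per_epoch - 1) * batch_size + 1
high = steps_per_epoch * batch_size
print(steps_per_epoch, low, high)  # 6 161 192
```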

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.7
  • Tokenizers 0.15.0
