
results

This model is a fine-tuned version of facebook/detr-resnet-50 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1423
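
The card does not include usage code, so here is a minimal inference sketch. It assumes the repository id Sabbasi-11/results (as shown on the hub page) contains both the fine-tuned DETR weights and the image processor configuration; the example image URL and the 0.5 confidence threshold are illustrative only, not part of this card.

```python
# Minimal object-detection inference sketch for this checkpoint.
# Assumptions: the repo ships an image processor config and an id2label mapping.
import torch
import requests
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "Sabbasi-11/results"  # repo id from the hub page
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

# Any RGB image works; this COCO URL is just an example.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples above a threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```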

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto TrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
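
The sketch below shows how these values map onto the transformers Trainer API. Only the numeric hyperparameters come from this card; the output directory name, dataset objects, and collate function are assumptions and would need to be supplied.

```python
# Mapping the card's hyperparameters onto TrainingArguments (sketch, not the exact training script).
from transformers import (
    AutoModelForObjectDetection,
    TrainingArguments,
    Trainer,
)

model = AutoModelForObjectDetection.from_pretrained("facebook/detr-resnet-50")

args = TrainingArguments(
    output_dir="results",            # assumption: matches the repo name
    learning_rate=5e-5,              # learning_rate
    per_device_train_batch_size=4,   # train_batch_size
    per_device_eval_batch_size=4,    # eval_batch_size
    seed=42,                         # seed
    adam_beta1=0.9,                  # optimizer: Adam betas
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # optimizer: Adam epsilon
    lr_scheduler_type="linear",      # lr_scheduler_type
    num_train_epochs=30,             # num_epochs
    eval_strategy="epoch",           # assumption: per-epoch eval, matching the results table
    remove_unused_columns=False,     # DETR needs the raw pixel values / labels dict
)

# trainer = Trainer(
#     model=model,
#     args=args,
#     train_dataset=train_dataset,   # hypothetical: imagefolder train split
#     eval_dataset=eval_dataset,     # hypothetical: imagefolder eval split
#     data_collator=collate_fn,      # hypothetical: pads pixel values, keeps labels
# )
# trainer.train()
```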

Training results

("No log" indicates that no training loss was recorded at that logging interval.)

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 2    | 5.4207          |
| No log        | 2.0   | 4    | 3.5959          |
| No log        | 3.0   | 6    | 2.6546          |
| No log        | 4.0   | 8    | 2.5924          |
| 4.7135        | 5.0   | 10   | 2.4246          |
| 4.7135        | 6.0   | 12   | 2.3156          |
| 4.7135        | 7.0   | 14   | 2.0630          |
| 4.7135        | 8.0   | 16   | 1.8700          |
| 4.7135        | 9.0   | 18   | 1.9057          |
| 1.9751        | 10.0  | 20   | 1.8555          |
| 1.9751        | 11.0  | 22   | 1.7879          |
| 1.9751        | 12.0  | 24   | 1.6941          |
| 1.9751        | 13.0  | 26   | 1.6313          |
| 1.9751        | 14.0  | 28   | 1.5788          |
| 1.6067        | 15.0  | 30   | 1.5095          |
| 1.6067        | 16.0  | 32   | 1.4926          |
| 1.6067        | 17.0  | 34   | 1.3962          |
| 1.6067        | 18.0  | 36   | 1.3799          |
| 1.6067        | 19.0  | 38   | 1.4024          |
| 1.3628        | 20.0  | 40   | 1.3731          |
| 1.3628        | 21.0  | 42   | 1.3351          |
| 1.3628        | 22.0  | 44   | 1.2939          |
| 1.3628        | 23.0  | 46   | 1.3056          |
| 1.3628        | 24.0  | 48   | 1.2118          |
| 1.2051        | 25.0  | 50   | 1.1925          |
| 1.2051        | 26.0  | 52   | 1.1810          |
| 1.2051        | 27.0  | 54   | 1.1621          |
| 1.2051        | 28.0  | 56   | 1.1491          |
| 1.2051        | 29.0  | 58   | 1.1433          |
| 1.0991        | 30.0  | 60   | 1.1423          |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.5.0+cu121
  • Datasets 3.1.0
  • Tokenizers 0.19.1
Model size

  • Parameters: 41.6M
  • Tensor type: F32 (safetensors)