011-microsoft-deberta-v3-base-finetuned-yahoo-8000_2000

This model is a fine-tuned version of microsoft/deberta-v3-base. The training dataset is not identified in this card (the template reads "None"); the model name suggests a Yahoo topic-classification setup with 8000 training and 2000 evaluation examples. It achieves the following results on the evaluation set (a loading sketch follows the list):

  • Loss: 1.8660
  • F1: 0.7055
  • Accuracy: 0.7045
  • Precision: 0.7076
  • Recall: 0.7045
  • System RAM Used (GB): 4.2773
  • System RAM Total (GB): 83.4807
  • GPU RAM Allocated (GB): 2.0897
  • GPU RAM Cached (GB): 25.8555
  • GPU RAM Total (GB): 39.5640
  • GPU Utilization (%): 48
  • Disk Space Used (GB): 35.8287
  • Disk Space Total (GB): 78.1898
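
As a minimal loading sketch: the repository ID below is the one shown on this card's Hub page, the input text is a hypothetical example, and label names are read from the fine-tuned config rather than assumed here.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repository ID as listed on the Hub page for this card.
repo_id = "diogopaes10/011-microsoft-deberta-v3-base-finetuned-yahoo-8000_2000"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Hypothetical input; any short text works.
text = "What is the best way to learn a new programming language?"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

pred_id = int(logits.argmax(dim=-1))
# Label names come from the fine-tuned checkpoint's config (id2label).
print(model.config.id2label[pred_id])
```

Note that DeBERTa-v3 checkpoints use a SentencePiece-based tokenizer, so the sentencepiece package must be installed alongside Transformers.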

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch appears after the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
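
A sketch of how these values map onto the Hugging Face Trainer API, assuming the Trainer was used; the output directory is a hypothetical placeholder, and the Adam betas, epsilon, and linear scheduler listed above are also the Trainer defaults, shown explicitly here.

```python
from transformers import TrainingArguments

# Hypothetical output directory; all other values mirror the list above.
training_args = TrainingArguments(
    output_dir="011-microsoft-deberta-v3-base-finetuned-yahoo-8000_2000",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
)
```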

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Accuracy | Precision | Recall | System RAM Used (GB) | System RAM Total (GB) | GPU RAM Allocated (GB) | GPU RAM Cached (GB) | GPU RAM Total (GB) | GPU Utilization (%) | Disk Space Used (GB) | Disk Space Total (GB) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.6916 | 0.75 | 188 | 1.1063 | 0.6708 | 0.6755 | 0.6900 | 0.6755 | 4.0191 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 50 | 24.8064 | 78.1898 |
| 0.9694 | 1.5 | 376 | 0.9586 | 0.7181 | 0.7195 | 0.7198 | 0.7195 | 4.2536 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 50 | 29.6418 | 78.1898 |
| 0.8509 | 2.26 | 564 | 0.9748 | 0.7070 | 0.712 | 0.7161 | 0.712 | 4.1602 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 46 | 29.6418 | 78.1898 |
| 0.7475 | 3.01 | 752 | 0.9447 | 0.7122 | 0.714 | 0.7148 | 0.714 | 4.1607 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 50 | 29.6420 | 78.1898 |
| 0.5841 | 3.76 | 940 | 1.0064 | 0.7077 | 0.711 | 0.7225 | 0.711 | 4.1889 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 47 | 29.6420 | 78.1898 |
| 0.4972 | 4.51 | 1128 | 1.0585 | 0.7110 | 0.714 | 0.7129 | 0.714 | 4.1766 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 47 | 29.6421 | 78.1898 |
| 0.4555 | 5.26 | 1316 | 1.1175 | 0.7086 | 0.7075 | 0.7151 | 0.7075 | 4.2257 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 46 | 33.7652 | 78.1898 |
| 0.3535 | 6.02 | 1504 | 1.1749 | 0.7032 | 0.708 | 0.7077 | 0.708 | 4.2302 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 50 | 33.7653 | 78.1898 |
| 0.2614 | 6.77 | 1692 | 1.2028 | 0.7056 | 0.709 | 0.7079 | 0.709 | 4.2376 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 49 | 33.7654 | 78.1898 |
| 0.2321 | 7.52 | 1880 | 1.2961 | 0.7019 | 0.698 | 0.7085 | 0.698 | 4.2248 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 49 | 33.7656 | 78.1898 |
| 0.197 | 8.27 | 2068 | 1.3960 | 0.7098 | 0.712 | 0.7137 | 0.712 | 4.2194 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 45 | 33.7657 | 78.1898 |
| 0.1505 | 9.02 | 2256 | 1.4310 | 0.7093 | 0.7075 | 0.7133 | 0.7075 | 4.2418 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 48 | 35.8277 | 78.1898 |
| 0.1132 | 9.78 | 2444 | 1.5454 | 0.7053 | 0.7045 | 0.7097 | 0.7045 | 4.2931 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 48 | 35.8278 | 78.1898 |
| 0.0979 | 10.53 | 2632 | 1.6420 | 0.7090 | 0.708 | 0.7171 | 0.708 | 4.2793 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 45 | 35.8281 | 78.1898 |
| 0.0818 | 11.28 | 2820 | 1.6869 | 0.7062 | 0.7065 | 0.7102 | 0.7065 | 4.2822 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 49 | 35.8281 | 78.1898 |
| 0.062 | 12.03 | 3008 | 1.7818 | 0.7043 | 0.701 | 0.7123 | 0.701 | 4.2864 | 83.4807 | 2.0901 | 25.8555 | 39.5640 | 50 | 35.8282 | 78.1898 |
| 0.0433 | 12.78 | 3196 | 1.7981 | 0.7080 | 0.707 | 0.7110 | 0.707 | 4.2666 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 49 | 35.8282 | 78.1898 |
| 0.0368 | 13.54 | 3384 | 1.8403 | 0.7079 | 0.7055 | 0.7131 | 0.7055 | 4.2783 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 47 | 35.8285 | 78.1898 |
| 0.0379 | 14.29 | 3572 | 1.8536 | 0.7052 | 0.705 | 0.7074 | 0.705 | 4.3013 | 83.4807 | 2.0898 | 25.8555 | 39.5640 | 47 | 35.8286 | 78.1898 |
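
Recall equals Accuracy in every row, which is consistent with weighted-average metrics (weighted recall reduces to accuracy). A sketch of a compute_metrics callback that would produce these columns, assuming weighted averaging and scikit-learn; the card does not state either, so both are assumptions:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "weighted" averaging is an assumption, inferred from Recall == Accuracy.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "f1": f1,
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
    }
```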

Framework versions

  • Transformers 4.31.0
  • PyTorch 2.0.1+cu118
  • Datasets 2.13.1
  • Tokenizers 0.13.3