---
base_model: Zamoranesis/clinical_transcripts_roberta
tags:
  - generated_from_trainer
model-index:
  - name: clinical_transcripts_roberta_distilled
    results: []
---

# clinical_transcripts_roberta_distilled

This model is a fine-tuned version of Zamoranesis/clinical_transcripts_roberta on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 106.7275
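The base model name suggests a RoBERTa-style masked language model trained on clinical transcripts, so the distilled checkpoint can presumably be used for fill-mask inference. A minimal sketch, assuming the checkpoint is published on the Hugging Face Hub under the author and model name shown in this card, and that its tokenizer uses RoBERTa's `<mask>` token (the example sentence is illustrative only):

```python
from transformers import pipeline

# Hub id assembled from the card's author and model name (an assumption,
# not confirmed by the card itself).
MODEL_ID = "Zamoranesis/clinical_transcripts_roberta_distilled"

def top_fill_mask(text: str, model_id: str = MODEL_ID, k: int = 5):
    """Return the k most likely completions for the <mask> token in `text`."""
    fill = pipeline("fill-mask", model=model_id)
    return fill(text, top_k=k)

if __name__ == "__main__":
    # Downloads the checkpoint on first use.
    for pred in top_fill_mask("The patient was prescribed <mask> for hypertension."):
        print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```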

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- lr_scheduler_warmup_steps: 100
- training_steps: 4000
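The card lists both a warmup ratio and a warmup step count; in transformers, an explicit `warmup_steps` value takes precedence over `warmup_ratio` when both are set, so the effective schedule warms up over 100 steps. The resulting linear schedule can be sketched in plain Python (an illustration of the shape, not the library's implementation):

```python
def linear_schedule_lr(step: int,
                       peak_lr: float = 5e-4,
                       warmup_steps: int = 100,
                       total_steps: int = 4000) -> float:
    """Linear warmup from 0 to peak_lr over warmup_steps, then linear decay to 0."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0, total_steps - step) / (total_steps - warmup_steps)

# Checkpoints of the schedule with this card's hyperparameters:
# step 0 -> 0.0, step 100 -> 5e-4 (peak), step 4000 -> 0.0
```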

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|---------------|-------|------|-----------------|
| 276.9792      | 0.31  | 100  | 198.7813        |
| 202.4796      | 0.63  | 200  | 187.2836        |
| 187.377       | 0.94  | 300  | 168.7091        |
| 171.4996      | 1.25  | 400  | 171.7099        |
| 162.2323      | 1.57  | 500  | 162.8250        |
| 165.8647      | 1.88  | 600  | 161.5869        |
| 151.5353      | 2.19  | 700  | 151.7282        |
| 145.5493      | 2.51  | 800  | 146.7330        |
| 148.6375      | 2.82  | 900  | 154.1694        |
| 143.963       | 3.13  | 1000 | 152.8062        |
| 141.3649      | 3.45  | 1100 | 149.8849        |
| 134.6703      | 3.76  | 1200 | 141.6481        |
| 136.8365      | 4.08  | 1300 | 142.7610        |
| 127.3401      | 4.39  | 1400 | 134.1384        |
| 128.1466      | 4.7   | 1500 | 137.5692        |
| 130.1291      | 5.02  | 1600 | 131.3674        |
| 123.2371      | 5.33  | 1700 | 133.7921        |
| 128.0497      | 5.64  | 1800 | 137.7079        |
| 121.8081      | 5.96  | 1900 | 131.1400        |
| 118.9764      | 6.27  | 2000 | 136.2727        |
| 111.3325      | 6.58  | 2100 | 125.8130        |
| 112.32        | 6.9   | 2200 | 122.2134        |
| 110.8909      | 7.21  | 2300 | 126.7264        |
| 113.9796      | 7.52  | 2400 | 121.6689        |
| 109.1709      | 7.84  | 2500 | 123.6003        |
| 103.981       | 8.15  | 2600 | 115.3986        |
| 99.9035       | 8.46  | 2700 | 118.1729        |
| 102.7026      | 8.78  | 2800 | 116.7197        |
| 102.889       | 9.09  | 2900 | 110.3246        |
| 97.2037       | 9.4   | 3000 | 111.4095        |
| 96.6495       | 9.72  | 3100 | 110.4597        |
| 91.2564       | 10.03 | 3200 | 114.8320        |
| 93.1662       | 10.34 | 3300 | 112.2192        |
| 94.8274       | 10.66 | 3400 | 108.9920        |
| 91.7985       | 10.97 | 3500 | 106.0877        |
| 92.6536       | 11.29 | 3600 | 101.6935        |
| 85.6407       | 11.6  | 3700 | 103.1658        |
| 88.6192       | 11.91 | 3800 | 98.9863         |
| 87.0916       | 12.23 | 3900 | 102.7780        |
| 84.1347       | 12.54 | 4000 | 106.7275        |

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
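To reproduce this environment, the listed versions can presumably be pinned with pip (package names assumed to match PyPI; the `+cu118` PyTorch build is served from the PyTorch CUDA 11.8 wheel index rather than PyPI):

```shell
pip install "transformers==4.34.1" "datasets==2.14.5" "tokenizers==0.14.1"
pip install "torch==2.1.0+cu118" --index-url https://download.pytorch.org/whl/cu118
```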