# roberta-large-ner-qlorafinetune-runs-colab-32size
This model is a fine-tuned version of FacebookAI/xlm-roberta-large on the biobert_json dataset. It achieves the following results on the evaluation set:
- Loss: 1.3811
- Precision: 0.0
- Recall: 0.0
- F1: 0.0
- Accuracy: 0.7180

Note that precision, recall, and F1 are all 0.0: as the results table below shows, training diverged after roughly step 340 and never recovered, so the final checkpoint effectively predicts no entities.
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: paged_adamw_8bit with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- training_steps: 1820
- mixed_precision_training: Native AMP
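The linear schedule above decays the learning rate from 4e-4 to 0 over the 1820 training steps. A minimal pure-Python sketch of that schedule follows; the `warmup_steps=0` default is an assumption, since the card does not list a warmup value.

```python
def linear_lr(step, base_lr=4e-4, total_steps=1820, warmup_steps=0):
    """Linear warmup (if any) followed by linear decay to 0.

    Assumes warmup_steps=0, since the card lists no warmup value.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))     # 0.0004 (full rate at the start)
print(linear_lr(910))   # 0.0002 (halfway through the run)
print(linear_lr(1820))  # 0.0    (fully decayed at the final step)
```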
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:--:|:--------:|
2.6574 | 0.0654 | 20 | 1.4181 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3164 | 0.1307 | 40 | 0.9444 | 0.5677 | 0.2339 | 0.3313 | 0.7707 |
0.7519 | 0.1961 | 60 | 0.4053 | 0.6857 | 0.6773 | 0.6815 | 0.8896 |
0.3851 | 0.2614 | 80 | 0.3130 | 0.6785 | 0.8320 | 0.7474 | 0.9074 |
0.2798 | 0.3268 | 100 | 0.1994 | 0.8044 | 0.8875 | 0.8439 | 0.9444 |
0.2741 | 0.3922 | 120 | 0.1540 | 0.8440 | 0.8833 | 0.8632 | 0.9535 |
0.2007 | 0.4575 | 140 | 0.1244 | 0.8975 | 0.8890 | 0.8933 | 0.9625 |
0.1774 | 0.5229 | 160 | 0.1287 | 0.8733 | 0.9233 | 0.8976 | 0.9649 |
0.1743 | 0.5882 | 180 | 0.1260 | 0.8708 | 0.9317 | 0.9002 | 0.9637 |
0.1342 | 0.6536 | 200 | 0.1489 | 0.8818 | 0.9183 | 0.8997 | 0.9641 |
0.1503 | 0.7190 | 220 | 0.1026 | 0.9066 | 0.9060 | 0.9063 | 0.9680 |
0.1428 | 0.7843 | 240 | 0.1164 | 0.8831 | 0.9411 | 0.9112 | 0.9666 |
0.1195 | 0.8497 | 260 | 0.0928 | 0.9118 | 0.9390 | 0.9252 | 0.9737 |
0.1258 | 0.9150 | 280 | 0.1253 | 0.8728 | 0.9311 | 0.9010 | 0.9633 |
0.1426 | 0.9804 | 300 | 0.0987 | 0.9030 | 0.9368 | 0.9196 | 0.9698 |
0.1199 | 1.0458 | 320 | 0.0908 | 0.9057 | 0.9417 | 0.9234 | 0.9727 |
0.0952 | 1.1111 | 340 | 0.1012 | 0.8972 | 0.9489 | 0.9223 | 0.9688 |
0.8406 | 1.1765 | 360 | 1.8325 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4909 | 1.2418 | 380 | 1.3488 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4882 | 1.3072 | 400 | 1.3836 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4205 | 1.3725 | 420 | 1.4048 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4317 | 1.4379 | 440 | 1.3429 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4078 | 1.5033 | 460 | 1.3754 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4103 | 1.5686 | 480 | 1.3568 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3914 | 1.6340 | 500 | 1.3496 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4035 | 1.6993 | 520 | 1.3534 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3792 | 1.7647 | 540 | 1.3515 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4112 | 1.8301 | 560 | 1.3424 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4609 | 1.8954 | 580 | 1.3527 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4331 | 1.9608 | 600 | 1.3461 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4255 | 2.0261 | 620 | 1.3464 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.434 | 2.0915 | 640 | 1.3594 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.2786 | 2.1569 | 660 | 1.3890 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3957 | 2.2222 | 680 | 1.3983 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4165 | 2.2876 | 700 | 1.3707 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3678 | 2.3529 | 720 | 1.3762 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4533 | 2.4183 | 740 | 1.3453 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.41 | 2.4837 | 760 | 1.3401 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4114 | 2.5490 | 780 | 1.3681 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4556 | 2.6144 | 800 | 1.3682 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3674 | 2.6797 | 820 | 1.3747 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4153 | 2.7451 | 840 | 1.3538 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4545 | 2.8105 | 860 | 1.3483 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4072 | 2.8758 | 880 | 1.3413 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3722 | 2.9412 | 900 | 1.3550 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3143 | 3.0065 | 920 | 1.3483 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3608 | 3.0719 | 940 | 1.3673 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3687 | 3.1373 | 960 | 1.3510 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3459 | 3.2026 | 980 | 1.3556 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4105 | 3.2680 | 1000 | 1.3535 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.365 | 3.3333 | 1020 | 1.3822 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4815 | 3.3987 | 1040 | 1.3640 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4073 | 3.4641 | 1060 | 1.3401 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4105 | 3.5294 | 1080 | 1.3532 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3636 | 3.5948 | 1100 | 1.3716 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3118 | 3.6601 | 1120 | 1.3949 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4132 | 3.7255 | 1140 | 1.3732 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3842 | 3.7908 | 1160 | 1.3795 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3669 | 3.8562 | 1180 | 1.3607 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4412 | 3.9216 | 1200 | 1.3613 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4812 | 3.9869 | 1220 | 1.3724 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3959 | 4.0523 | 1240 | 1.3770 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4439 | 4.1176 | 1260 | 1.3692 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3897 | 4.1830 | 1280 | 1.3421 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3845 | 4.2484 | 1300 | 1.3522 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3326 | 4.3137 | 1320 | 1.3536 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3523 | 4.3791 | 1340 | 1.4286 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3851 | 4.4444 | 1360 | 1.3639 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.355 | 4.5098 | 1380 | 1.3625 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3818 | 4.5752 | 1400 | 1.3791 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4169 | 4.6405 | 1420 | 1.3782 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4191 | 4.7059 | 1440 | 1.3663 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.417 | 4.7712 | 1460 | 1.3411 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4615 | 4.8366 | 1480 | 1.3632 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3694 | 4.9020 | 1500 | 1.3723 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.38 | 4.9673 | 1520 | 1.3482 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4326 | 5.0327 | 1540 | 1.3972 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3453 | 5.0980 | 1560 | 1.3706 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3799 | 5.1634 | 1580 | 1.3655 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4038 | 5.2288 | 1600 | 1.3636 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3686 | 5.2941 | 1620 | 1.3764 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3959 | 5.3595 | 1640 | 1.3638 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4266 | 5.4248 | 1660 | 1.3625 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4607 | 5.4902 | 1680 | 1.3582 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3667 | 5.5556 | 1700 | 1.3868 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.424 | 5.6209 | 1720 | 1.3655 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3924 | 5.6863 | 1740 | 1.3636 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4126 | 5.7516 | 1760 | 1.3672 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4733 | 5.8170 | 1780 | 1.3594 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3216 | 5.8824 | 1800 | 1.3701 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.2697 | 5.9477 | 1820 | 1.3811 | 0.0 | 0.0 | 0.0 | 0.7180 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.20.3
## Model tree

brandonRivas/roberta-large-ner-qlorafinetune-runs-colab-32size is fine-tuned from the base model FacebookAI/xlm-roberta-large.