# roberta-large-ner-qlorafinetune-runs-colab-16size
This model is a fine-tuned version of FacebookAI/xlm-roberta-large on the biobert_json dataset. It achieves the following results on the evaluation set:
- Loss: 1.3753
- Precision: 0.0
- Recall: 0.0
- F1: 0.0
- Accuracy: 0.7180
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: paged_adamw_8bit with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- training_steps: 1820
- mixed_precision_training: Native AMP
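The hyperparameters above can be collected into a plain config sketch. This is not the card's original training script; it simply restates the listed values and checks one implied fact: 1820 steps at the steps-per-epoch rate implied by the results table (step 20 at epoch 0.0327) is roughly 3 epochs.

```python
# Hyperparameters as listed on this card (the dict keys are illustrative,
# not the exact argument names used in the original training script).
config = {
    "learning_rate": 4e-4,
    "train_batch_size": 16,
    "eval_batch_size": 16,
    "seed": 42,
    "optim": "paged_adamw_8bit",
    "adam_betas": (0.9, 0.999),
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "max_steps": 1820,
}

# First logged row of the results table: step 20 corresponds to epoch 0.0327,
# so the run covers max_steps / (20 / 0.0327) epochs in total.
steps_per_epoch = 20 / 0.0327
epochs = config["max_steps"] / steps_per_epoch
print(round(epochs, 2))  # → 2.98
```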
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
1.4712 | 0.0327 | 20 | 1.4478 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.5121 | 0.0654 | 40 | 1.3767 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.425 | 0.0980 | 60 | 1.3857 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.381 | 0.1307 | 80 | 1.3831 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4605 | 0.1634 | 100 | 1.3532 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.328 | 0.1961 | 120 | 1.3521 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4338 | 0.2288 | 140 | 1.3490 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3164 | 0.2614 | 160 | 1.3466 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4324 | 0.2941 | 180 | 1.3636 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3392 | 0.3268 | 200 | 1.3439 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4068 | 0.3595 | 220 | 1.3502 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4644 | 0.3922 | 240 | 1.3481 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4998 | 0.4248 | 260 | 1.3465 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3813 | 0.4575 | 280 | 1.3722 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3603 | 0.4902 | 300 | 1.5226 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.429 | 0.5229 | 320 | 1.3694 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.5219 | 0.5556 | 340 | 1.3454 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.2741 | 0.5882 | 360 | 1.3554 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3714 | 0.6209 | 380 | 1.3873 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3988 | 0.6536 | 400 | 1.3440 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4748 | 0.6863 | 420 | 1.3482 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.2558 | 0.7190 | 440 | 1.3444 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4824 | 0.7516 | 460 | 1.3846 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4984 | 0.7843 | 480 | 1.3440 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.426 | 0.8170 | 500 | 1.3676 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4263 | 0.8497 | 520 | 1.3471 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4002 | 0.8824 | 540 | 1.3590 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4352 | 0.9150 | 560 | 1.3595 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4364 | 0.9477 | 580 | 1.4145 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3128 | 0.9804 | 600 | 1.3838 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3831 | 1.0131 | 620 | 1.3464 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4068 | 1.0458 | 640 | 1.4084 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.2825 | 1.0784 | 660 | 1.4334 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3083 | 1.1111 | 680 | 1.4108 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4875 | 1.1438 | 700 | 1.3736 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3941 | 1.1765 | 720 | 1.3658 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3525 | 1.2092 | 740 | 1.3649 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4536 | 1.2418 | 760 | 1.3924 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3456 | 1.2745 | 780 | 1.3721 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.5239 | 1.3072 | 800 | 1.3611 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4345 | 1.3399 | 820 | 1.3671 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3366 | 1.3725 | 840 | 1.4022 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3953 | 1.4052 | 860 | 1.3548 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4219 | 1.4379 | 880 | 1.3456 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3604 | 1.4706 | 900 | 1.3846 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4112 | 1.5033 | 920 | 1.3940 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4129 | 1.5359 | 940 | 1.3471 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3603 | 1.5686 | 960 | 1.3633 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4022 | 1.6013 | 980 | 1.3745 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.325 | 1.6340 | 1000 | 1.3902 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.332 | 1.6667 | 1020 | 1.4242 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.418 | 1.6993 | 1040 | 1.3596 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3788 | 1.7320 | 1060 | 1.3658 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3202 | 1.7647 | 1080 | 1.3776 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4051 | 1.7974 | 1100 | 1.3883 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3925 | 1.8301 | 1120 | 1.3487 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.372 | 1.8627 | 1140 | 1.3936 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.5219 | 1.8954 | 1160 | 1.3533 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4067 | 1.9281 | 1180 | 1.3808 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4154 | 1.9608 | 1200 | 1.3599 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3836 | 1.9935 | 1220 | 1.3601 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4528 | 2.0261 | 1240 | 1.3507 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3977 | 2.0588 | 1260 | 1.3915 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4612 | 2.0915 | 1280 | 1.3626 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.2652 | 2.1242 | 1300 | 1.4164 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.2759 | 2.1569 | 1320 | 1.3706 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.364 | 2.1895 | 1340 | 1.3713 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3539 | 2.2222 | 1360 | 1.3788 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4394 | 2.2549 | 1380 | 1.3606 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3256 | 2.2876 | 1400 | 1.3903 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3907 | 2.3203 | 1420 | 1.3634 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3354 | 2.3529 | 1440 | 1.3979 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4186 | 2.3856 | 1460 | 1.3519 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4696 | 2.4183 | 1480 | 1.3653 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3621 | 2.4510 | 1500 | 1.3844 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4385 | 2.4837 | 1520 | 1.3634 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3791 | 2.5163 | 1540 | 1.3723 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4135 | 2.5490 | 1560 | 1.3638 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4469 | 2.5817 | 1580 | 1.3855 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4348 | 2.6144 | 1600 | 1.3709 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3417 | 2.6471 | 1620 | 1.3837 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3618 | 2.6797 | 1640 | 1.3864 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.352 | 2.7124 | 1660 | 1.3839 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4488 | 2.7451 | 1680 | 1.3632 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.4843 | 2.7778 | 1700 | 1.3522 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3993 | 2.8105 | 1720 | 1.3656 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3753 | 2.8431 | 1740 | 1.3763 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.411 | 2.8758 | 1760 | 1.3726 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3388 | 2.9085 | 1780 | 1.3724 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3951 | 2.9412 | 1800 | 1.3731 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.2433 | 2.9739 | 1820 | 1.3753 | 0.0 | 0.0 | 0.0 | 0.7180 |
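One plausible reading of the table: entity-level precision, recall, and F1 stay at 0.0 while token accuracy holds at 0.7180, which is the pattern produced when a model predicts the majority "O" tag for every token. The sketch below is a hand-rolled illustration of that effect, not the card's actual evaluation code (which likely uses a seqeval-style metric); the example tag sequences are invented.

```python
def entity_spans(tags):
    """Extract (type, start, end) spans from a BIO tag sequence."""
    spans = []
    start = etype = None
    for i, t in enumerate(tags + ["O"]):  # trailing "O" closes a final span
        if start is not None and t != f"I-{etype}":
            spans.append((etype, start, i))
            start = etype = None
        if t.startswith("B-"):
            start, etype = i, t[2:]
    return spans

# Invented reference tags: 5 of 7 tokens are "O".
true = ["B-DIS", "O", "O", "O", "O", "O", "B-CHEM"]
pred = ["O"] * len(true)  # model collapses to the majority tag

true_spans = set(entity_spans(true))
pred_spans = set(entity_spans(pred))
tp = len(true_spans & pred_spans)
precision = tp / len(pred_spans) if pred_spans else 0.0
recall = tp / len(true_spans) if true_spans else 0.0
f1 = 0.0 if precision + recall == 0 else 2 * precision * recall / (precision + recall)
accuracy = sum(t == p for t, p in zip(true, pred)) / len(true)
print(precision, recall, f1, round(accuracy, 4))  # → 0.0 0.0 0.0 0.7143
```

Entity metrics are zero because no predicted span matches a gold span, yet token accuracy equals the fraction of "O" tokens, which mirrors the flat 0.7180 accuracy above.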
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.20.3