# xml-roberta-large-ner-qlorafinetune-runs-colab-16size
This model is a fine-tuned version of FacebookAI/xlm-roberta-large on the biobert_json dataset. It achieves the following results on the evaluation set:
- Loss: 0.0703
- Precision: 0.9398
- Recall: 0.9601
- F1: 0.9499
- Accuracy: 0.9819
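The precision, recall, and F1 above are presumably entity-level (seqeval-style) scores, the usual convention for Hugging Face token-classification cards: a predicted entity counts as correct only when its type and span both match the gold annotation. A minimal sketch of that computation from BIO tag sequences (the label names are illustrative, not the biobert_json tag set):

```python
# Entity-level precision/recall/F1 from BIO tags (seqeval-style sketch).
# Simplification: orphan I- tags (no matching B-/I- prefix before them)
# are ignored rather than promoted to new entities.

def extract_entities(tags):
    """Collect (type, start, end) spans from one BIO tag sequence."""
    entities, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the last span
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and tag[2:] != etype
        ):
            if etype is not None:
                entities.append((etype, start, i))
            start, etype = (i, tag[2:]) if tag.startswith("B-") else (None, None)
    return set(entities)

def entity_f1(true_seqs, pred_seqs):
    """Micro-averaged entity-level precision, recall, F1 over sequences."""
    tp = fp = fn = 0
    for t, p in zip(true_seqs, pred_seqs):
        gold, pred = extract_entities(t), extract_entities(p)
        tp += len(gold & pred)
        fp += len(pred - gold)
        fn += len(gold - pred)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

Note that token-level accuracy (the 0.9819 above) can sit well above entity-level F1, since a single wrong boundary token invalidates a whole entity.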
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: paged_adamw_8bit (betas=(0.9, 0.999), epsilon=1e-08); no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 2447
- mixed_precision_training: Native AMP
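A hedged sketch of how these hyperparameters could plug into a QLoRA fine-tune with transformers + peft + bitsandbytes. The LoRA rank/alpha/dropout, target modules, and label count are not reported in this card and are purely illustrative:

```python
# Hypothetical reconstruction of the QLoRA setup implied by this card:
# a 4-bit quantized xlm-roberta-large base with LoRA adapters, trained
# with the hyperparameters listed above. LoRA settings and NUM_LABELS
# are NOT from the card; they are placeholders.
import torch
from transformers import (AutoModelForTokenClassification,
                          BitsAndBytesConfig, TrainingArguments)
from peft import LoraConfig, get_peft_model

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # QLoRA: 4-bit base weights
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

NUM_LABELS = 17  # placeholder: set to the size of the biobert_json tag set

model = AutoModelForTokenClassification.from_pretrained(
    "FacebookAI/xlm-roberta-large",
    num_labels=NUM_LABELS,
    quantization_config=bnb_config,
)
model = get_peft_model(model, LoraConfig(
    task_type="TOKEN_CLS",
    r=16, lora_alpha=32, lora_dropout=0.1,  # illustrative, not from the card
    target_modules=["query", "value"],
))

training_args = TrainingArguments(
    output_dir="xml-roberta-large-ner-qlorafinetune-runs-colab-16size",
    learning_rate=4e-4,                     # 0.0004, as listed above
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    max_steps=2447,
    lr_scheduler_type="linear",
    optim="paged_adamw_8bit",
    fp16=True,                              # Native AMP mixed precision
)
```

The paged 8-bit AdamW optimizer and 4-bit base weights are what keep a ~560M-parameter encoder trainable within a single Colab GPU's memory.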
### Training results
Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
---|---|---|---|---|---|---|---|
2.413 | 0.0327 | 20 | 1.2561 | 0.0 | 0.0 | 0.0 | 0.7180 |
1.3486 | 0.0654 | 40 | 1.0226 | 0.4583 | 0.0566 | 0.1008 | 0.7382 |
0.9276 | 0.0980 | 60 | 0.8356 | 0.7047 | 0.4085 | 0.5172 | 0.8160 |
0.6383 | 0.1307 | 80 | 0.3561 | 0.7796 | 0.7111 | 0.7437 | 0.8990 |
0.3991 | 0.1634 | 100 | 0.3146 | 0.7408 | 0.8418 | 0.7881 | 0.9164 |
0.3842 | 0.1961 | 120 | 0.2230 | 0.7971 | 0.8450 | 0.8204 | 0.9373 |
0.3549 | 0.2288 | 140 | 0.2281 | 0.7362 | 0.8442 | 0.7865 | 0.9250 |
0.281 | 0.2614 | 160 | 0.1842 | 0.8711 | 0.8446 | 0.8577 | 0.9482 |
0.2698 | 0.2941 | 180 | 0.1739 | 0.8764 | 0.8503 | 0.8631 | 0.9493 |
0.2113 | 0.3268 | 200 | 0.1578 | 0.8504 | 0.8464 | 0.8484 | 0.9521 |
0.2151 | 0.3595 | 220 | 0.1420 | 0.8876 | 0.8997 | 0.8936 | 0.9608 |
0.2492 | 0.3922 | 240 | 0.1524 | 0.8738 | 0.8997 | 0.8865 | 0.9598 |
0.2158 | 0.4248 | 260 | 0.1373 | 0.8951 | 0.8855 | 0.8903 | 0.9597 |
0.1802 | 0.4575 | 280 | 0.1363 | 0.8860 | 0.9003 | 0.8931 | 0.9621 |
0.1639 | 0.4902 | 300 | 0.1168 | 0.8874 | 0.9200 | 0.9034 | 0.9653 |
0.1798 | 0.5229 | 320 | 0.1228 | 0.8882 | 0.8906 | 0.8894 | 0.9616 |
0.2096 | 0.5556 | 340 | 0.1361 | 0.8584 | 0.9020 | 0.8797 | 0.9581 |
0.1619 | 0.5882 | 360 | 0.1195 | 0.8876 | 0.9168 | 0.9020 | 0.9648 |
0.1476 | 0.6209 | 380 | 0.1139 | 0.9065 | 0.9035 | 0.9050 | 0.9666 |
0.1575 | 0.6536 | 400 | 0.1082 | 0.9003 | 0.9178 | 0.9090 | 0.9674 |
0.1567 | 0.6863 | 420 | 0.1368 | 0.8615 | 0.9366 | 0.8975 | 0.9590 |
0.1529 | 0.7190 | 440 | 0.1175 | 0.9131 | 0.8853 | 0.8990 | 0.9657 |
0.1629 | 0.7516 | 460 | 0.1589 | 0.8395 | 0.9404 | 0.8871 | 0.9491 |
0.1626 | 0.7843 | 480 | 0.1017 | 0.8992 | 0.9458 | 0.9219 | 0.9705 |
0.1606 | 0.8170 | 500 | 0.1199 | 0.8788 | 0.9032 | 0.8908 | 0.9623 |
0.1259 | 0.8497 | 520 | 0.1181 | 0.8827 | 0.9198 | 0.9009 | 0.9655 |
0.1482 | 0.8824 | 540 | 0.1268 | 0.8798 | 0.9180 | 0.8985 | 0.9620 |
0.1562 | 0.9150 | 560 | 0.1115 | 0.9030 | 0.9279 | 0.9153 | 0.9678 |
0.1503 | 0.9477 | 580 | 0.1029 | 0.8893 | 0.9433 | 0.9155 | 0.9696 |
0.1446 | 0.9804 | 600 | 0.0880 | 0.9248 | 0.9315 | 0.9281 | 0.9747 |
0.1362 | 1.0131 | 620 | 0.0925 | 0.9041 | 0.9463 | 0.9247 | 0.9740 |
0.1059 | 1.0458 | 640 | 0.1019 | 0.9003 | 0.9425 | 0.9209 | 0.9719 |
0.0956 | 1.0784 | 660 | 0.1058 | 0.9221 | 0.9217 | 0.9219 | 0.9720 |
0.1068 | 1.1111 | 680 | 0.0935 | 0.9085 | 0.9447 | 0.9262 | 0.9729 |
0.1001 | 1.1438 | 700 | 0.1027 | 0.9001 | 0.9397 | 0.9195 | 0.9715 |
0.1144 | 1.1765 | 720 | 0.0896 | 0.9111 | 0.9368 | 0.9238 | 0.9721 |
0.0917 | 1.2092 | 740 | 0.0824 | 0.9279 | 0.9385 | 0.9332 | 0.9770 |
0.1192 | 1.2418 | 760 | 0.0909 | 0.9197 | 0.9440 | 0.9317 | 0.9742 |
0.1283 | 1.2745 | 780 | 0.0887 | 0.9103 | 0.9271 | 0.9186 | 0.9725 |
0.0939 | 1.3072 | 800 | 0.0864 | 0.9261 | 0.9373 | 0.9317 | 0.9755 |
0.1064 | 1.3399 | 820 | 0.0878 | 0.9115 | 0.9410 | 0.9260 | 0.9742 |
0.1176 | 1.3725 | 840 | 0.0813 | 0.9147 | 0.9317 | 0.9231 | 0.9753 |
0.0973 | 1.4052 | 860 | 0.0728 | 0.9201 | 0.9574 | 0.9384 | 0.9793 |
0.0977 | 1.4379 | 880 | 0.0847 | 0.9222 | 0.9518 | 0.9367 | 0.9775 |
0.117 | 1.4706 | 900 | 0.0830 | 0.9134 | 0.9552 | 0.9338 | 0.9752 |
0.0808 | 1.5033 | 920 | 0.0797 | 0.9183 | 0.9566 | 0.9370 | 0.9763 |
0.0856 | 1.5359 | 940 | 0.0834 | 0.9173 | 0.9512 | 0.9340 | 0.9767 |
0.1172 | 1.5686 | 960 | 0.0897 | 0.9119 | 0.9446 | 0.9280 | 0.9733 |
0.0938 | 1.6013 | 980 | 0.0769 | 0.9263 | 0.9503 | 0.9381 | 0.9785 |
0.0872 | 1.6340 | 1000 | 0.0850 | 0.9177 | 0.9534 | 0.9353 | 0.9759 |
0.0702 | 1.6667 | 1020 | 0.0857 | 0.9217 | 0.9552 | 0.9382 | 0.9762 |
0.0787 | 1.6993 | 1040 | 0.0806 | 0.9158 | 0.9560 | 0.9355 | 0.9764 |
0.0678 | 1.7320 | 1060 | 0.0803 | 0.9252 | 0.9594 | 0.9420 | 0.9780 |
0.0909 | 1.7647 | 1080 | 0.0813 | 0.9066 | 0.9461 | 0.9260 | 0.9755 |
0.0879 | 1.7974 | 1100 | 0.0780 | 0.9186 | 0.9189 | 0.9187 | 0.9736 |
0.102 | 1.8301 | 1120 | 0.0827 | 0.9188 | 0.9497 | 0.9340 | 0.9762 |
0.076 | 1.8627 | 1140 | 0.0751 | 0.9272 | 0.9510 | 0.9390 | 0.9793 |
0.0952 | 1.8954 | 1160 | 0.0809 | 0.9162 | 0.9476 | 0.9316 | 0.9762 |
0.0825 | 1.9281 | 1180 | 0.0925 | 0.9049 | 0.9466 | 0.9253 | 0.9723 |
0.0979 | 1.9608 | 1200 | 0.0898 | 0.9102 | 0.9543 | 0.9317 | 0.9753 |
0.1007 | 1.9935 | 1220 | 0.0808 | 0.9247 | 0.9594 | 0.9417 | 0.9783 |
0.0611 | 2.0261 | 1240 | 0.0703 | 0.9370 | 0.9537 | 0.9453 | 0.9804 |
0.0668 | 2.0588 | 1260 | 0.0788 | 0.9248 | 0.9591 | 0.9416 | 0.9780 |
0.0637 | 2.0915 | 1280 | 0.0735 | 0.9297 | 0.9609 | 0.9450 | 0.9800 |
0.0472 | 2.1242 | 1300 | 0.0738 | 0.9266 | 0.9548 | 0.9405 | 0.9792 |
0.0577 | 2.1569 | 1320 | 0.0689 | 0.9358 | 0.9537 | 0.9446 | 0.9804 |
0.0964 | 2.1895 | 1340 | 0.0807 | 0.9079 | 0.9522 | 0.9295 | 0.9756 |
0.0556 | 2.2222 | 1360 | 0.0732 | 0.9289 | 0.9557 | 0.9421 | 0.9793 |
0.0599 | 2.2549 | 1380 | 0.0759 | 0.9303 | 0.9574 | 0.9437 | 0.9792 |
0.0698 | 2.2876 | 1400 | 0.0829 | 0.9156 | 0.9496 | 0.9323 | 0.9760 |
0.0586 | 2.3203 | 1420 | 0.0766 | 0.9270 | 0.9611 | 0.9438 | 0.9785 |
0.0545 | 2.3529 | 1440 | 0.0728 | 0.9375 | 0.9475 | 0.9424 | 0.9801 |
0.064 | 2.3856 | 1460 | 0.0762 | 0.9304 | 0.9628 | 0.9463 | 0.9797 |
0.0633 | 2.4183 | 1480 | 0.0756 | 0.9336 | 0.9519 | 0.9426 | 0.9789 |
0.088 | 2.4510 | 1500 | 0.0778 | 0.9156 | 0.9579 | 0.9362 | 0.9772 |
0.0737 | 2.4837 | 1520 | 0.0694 | 0.9353 | 0.9589 | 0.9470 | 0.9812 |
0.0619 | 2.5163 | 1540 | 0.0699 | 0.9348 | 0.9577 | 0.9461 | 0.9810 |
0.0683 | 2.5490 | 1560 | 0.0705 | 0.9340 | 0.9593 | 0.9465 | 0.9811 |
0.0658 | 2.5817 | 1580 | 0.0709 | 0.9319 | 0.9593 | 0.9454 | 0.9807 |
0.0674 | 2.6144 | 1600 | 0.0669 | 0.9388 | 0.9566 | 0.9476 | 0.9813 |
0.0618 | 2.6471 | 1620 | 0.0724 | 0.9287 | 0.9537 | 0.9410 | 0.9793 |
0.0588 | 2.6797 | 1640 | 0.0684 | 0.9425 | 0.9516 | 0.9471 | 0.9810 |
0.0607 | 2.7124 | 1660 | 0.0734 | 0.9349 | 0.9587 | 0.9466 | 0.9802 |
0.0648 | 2.7451 | 1680 | 0.0697 | 0.9302 | 0.9521 | 0.9410 | 0.9801 |
0.0461 | 2.7778 | 1700 | 0.0788 | 0.9218 | 0.9574 | 0.9392 | 0.9783 |
0.0696 | 2.8105 | 1720 | 0.0701 | 0.9346 | 0.9582 | 0.9463 | 0.9811 |
0.075 | 2.8431 | 1740 | 0.0719 | 0.9321 | 0.9588 | 0.9453 | 0.9797 |
0.0535 | 2.8758 | 1760 | 0.0736 | 0.9319 | 0.9570 | 0.9443 | 0.9796 |
0.05 | 2.9085 | 1780 | 0.0712 | 0.9358 | 0.9558 | 0.9457 | 0.9802 |
0.0574 | 2.9412 | 1800 | 0.0719 | 0.9291 | 0.9579 | 0.9432 | 0.9797 |
0.0591 | 2.9739 | 1820 | 0.0675 | 0.9432 | 0.9548 | 0.9490 | 0.9816 |
0.0447 | 3.0065 | 1840 | 0.0681 | 0.9382 | 0.9609 | 0.9494 | 0.9820 |
0.0514 | 3.0392 | 1860 | 0.0694 | 0.9352 | 0.9554 | 0.9452 | 0.9810 |
0.0359 | 3.0719 | 1880 | 0.0691 | 0.9353 | 0.9545 | 0.9448 | 0.9811 |
0.0428 | 3.1046 | 1900 | 0.0693 | 0.9381 | 0.9577 | 0.9478 | 0.9816 |
0.0518 | 3.1373 | 1920 | 0.0760 | 0.9304 | 0.9583 | 0.9442 | 0.9795 |
0.0514 | 3.1699 | 1940 | 0.0682 | 0.9379 | 0.9601 | 0.9489 | 0.9821 |
0.0357 | 3.2026 | 1960 | 0.0709 | 0.9360 | 0.9537 | 0.9447 | 0.9804 |
0.0455 | 3.2353 | 1980 | 0.0716 | 0.9322 | 0.9601 | 0.9460 | 0.9807 |
0.0476 | 3.2680 | 2000 | 0.0723 | 0.9320 | 0.9546 | 0.9432 | 0.9801 |
0.0339 | 3.3007 | 2020 | 0.0699 | 0.9366 | 0.9576 | 0.9470 | 0.9812 |
0.0562 | 3.3333 | 2040 | 0.0710 | 0.9314 | 0.9527 | 0.9420 | 0.9799 |
0.0466 | 3.3660 | 2060 | 0.0667 | 0.9422 | 0.9599 | 0.9510 | 0.9825 |
0.0468 | 3.3987 | 2080 | 0.0705 | 0.9347 | 0.9615 | 0.9479 | 0.9812 |
0.0443 | 3.4314 | 2100 | 0.0670 | 0.9412 | 0.9612 | 0.9511 | 0.9830 |
0.056 | 3.4641 | 2120 | 0.0691 | 0.9357 | 0.9567 | 0.9461 | 0.9811 |
0.0408 | 3.4967 | 2140 | 0.0707 | 0.9338 | 0.9595 | 0.9465 | 0.9808 |
0.0423 | 3.5294 | 2160 | 0.0700 | 0.9341 | 0.9563 | 0.9451 | 0.9810 |
0.0434 | 3.5621 | 2180 | 0.0681 | 0.9370 | 0.9510 | 0.9440 | 0.9813 |
0.0409 | 3.5948 | 2200 | 0.0704 | 0.9309 | 0.9533 | 0.9420 | 0.9803 |
0.0307 | 3.6275 | 2220 | 0.0667 | 0.9406 | 0.9589 | 0.9497 | 0.9825 |
0.0338 | 3.6601 | 2240 | 0.0665 | 0.9434 | 0.9579 | 0.9506 | 0.9829 |
0.0515 | 3.6928 | 2260 | 0.0689 | 0.9377 | 0.9550 | 0.9463 | 0.9814 |
0.0468 | 3.7255 | 2280 | 0.0685 | 0.9404 | 0.9577 | 0.9490 | 0.9821 |
0.0425 | 3.7582 | 2300 | 0.0687 | 0.9428 | 0.9606 | 0.9516 | 0.9827 |
0.0374 | 3.7908 | 2320 | 0.0681 | 0.9425 | 0.9611 | 0.9517 | 0.9828 |
0.0537 | 3.8235 | 2340 | 0.0691 | 0.9403 | 0.9617 | 0.9509 | 0.9824 |
0.0502 | 3.8562 | 2360 | 0.0695 | 0.9391 | 0.9609 | 0.9498 | 0.9822 |
0.0459 | 3.8889 | 2380 | 0.0700 | 0.9390 | 0.9611 | 0.9499 | 0.9820 |
0.0518 | 3.9216 | 2400 | 0.0709 | 0.9381 | 0.9612 | 0.9495 | 0.9817 |
0.0421 | 3.9542 | 2420 | 0.0704 | 0.9392 | 0.9603 | 0.9496 | 0.9818 |
0.0528 | 3.9869 | 2440 | 0.0703 | 0.9398 | 0.9601 | 0.9499 | 0.9819 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
### Model tree
jamesopeth/xml-roberta-large-ner-qlorafinetune-runs-colab-16size is fine-tuned from the base model FacebookAI/xlm-roberta-large.