xml-roberta-large-ner-qlorafinetune-runs-colab-32size

This model is a fine-tuned version of FacebookAI/xlm-roberta-large on the biobert_json dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):

  • Loss: 0.0818
  • Precision: 0.9372
  • Recall: 0.9585
  • F1: 0.9477
  • Accuracy: 0.9810
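
The precision, recall, and F1 values above are entity-level scores of the kind produced by seqeval, while accuracy is token-level. The snippet below is a minimal sketch (not the author's exact code) of how these metrics are typically computed in a Trainer compute_metrics hook; the label names are placeholders, not the actual biobert_json label set.

```python
# Minimal sketch: entity-level NER metrics via the `evaluate` wrapper around seqeval.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENTITY", "I-ENTITY"]  # placeholder: use the real biobert_json labels

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Ignore special/padding positions, which are marked with -100 during tokenization.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```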

Model description

This model is a QLoRA (PEFT) adapter for FacebookAI/xlm-roberta-large, fine-tuned for named-entity recognition (token classification) on the biobert_json dataset. More information needed.

Intended uses & limitations

More information needed
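
The adapter can be loaded on top of the base checkpoint with PEFT. The sketch below is a hedged usage example rather than an official recipe: the label count (NUM_LABELS) and the example sentence are assumptions, since the biobert_json label set is not documented in this card.

```python
# Hedged usage sketch: load the QLoRA/PEFT adapter on top of the base model.
# NUM_LABELS is an assumption; replace it with the actual biobert_json label count.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification
from peft import PeftModel

BASE_ID = "FacebookAI/xlm-roberta-large"
ADAPTER_ID = "brandonRivas/xml-roberta-large-ner-qlorafinetune-runs-colab-32size"
NUM_LABELS = 8  # placeholder assumption

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base_model = AutoModelForTokenClassification.from_pretrained(BASE_ID, num_labels=NUM_LABELS)
model = PeftModel.from_pretrained(base_model, ADAPTER_ID)
model.eval()

text = "The patient was treated with aspirin for acute myocardial infarction."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = logits.argmax(dim=-1)[0].tolist()
# Map pred_ids back to label strings via the id2label mapping used at training time.
```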

Training and evaluation data

The model was fine-tuned and evaluated on the biobert_json dataset. More information needed.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged reproduction sketch follows the list):

  • learning_rate: 0.0004
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: paged_adamw_8bit with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • training_steps: 2141
  • mixed_precision_training: Native AMP
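
Below is a hedged sketch of how this configuration could be reproduced with transformers, peft, and bitsandbytes. Values taken from the list above are copied as-is; the 4-bit quantization settings, the LoRA rank/alpha/dropout, and NUM_LABELS are assumptions, since they are not stated in this card.

```python
# Hedged reproduction sketch; values not listed in the card (quantization and
# LoRA settings, NUM_LABELS) are assumptions.
import torch
from transformers import (AutoModelForTokenClassification, BitsAndBytesConfig,
                          TrainingArguments)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

NUM_LABELS = 8  # placeholder assumption: the biobert_json label count

bnb_config = BitsAndBytesConfig(  # assumed 4-bit NF4 QLoRA setup
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForTokenClassification.from_pretrained(
    "FacebookAI/xlm-roberta-large",
    num_labels=NUM_LABELS,
    quantization_config=bnb_config,
)
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(  # rank, alpha, and dropout are assumptions
    task_type="TOKEN_CLS", r=16, lora_alpha=32, lora_dropout=0.1,
)
model = get_peft_model(model, lora_config)

args = TrainingArguments(
    output_dir="xml-roberta-large-ner-qlorafinetune-runs-colab-32size",
    learning_rate=4e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="paged_adamw_8bit",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=2141,
    fp16=True,  # "Native AMP" mixed precision
)
# Pass `args`, the quantized + LoRA model, and the tokenized biobert_json splits
# to transformers.Trainer to run the fine-tuning.
```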

Training results

Training Loss Epoch Step Validation Loss Precision Recall F1 Accuracy
2.0776 0.0654 20 1.3595 0.0 0.0 0.0 0.7180
1.4405 0.1307 40 1.3697 0.0 0.0 0.0 0.7180
1.4064 0.1961 60 1.3472 0.0 0.0 0.0 0.7180
1.3812 0.2614 80 1.2757 0.0 0.0 0.0 0.7180
1.1716 0.3268 100 0.6756 0.5890 0.4444 0.5066 0.8274
0.7189 0.3922 120 0.3631 0.7186 0.7394 0.7289 0.8974
0.4042 0.4575 140 0.2416 0.7440 0.8402 0.7892 0.9282
0.26 0.5229 160 0.1593 0.8510 0.8888 0.8695 0.9534
0.2169 0.5882 180 0.1403 0.8604 0.9057 0.8824 0.9569
0.1684 0.6536 200 0.1218 0.8735 0.9102 0.8915 0.9623
0.1719 0.7190 220 0.1043 0.8879 0.9337 0.9102 0.9688
0.1641 0.7843 240 0.1217 0.8751 0.9479 0.9101 0.9643
0.1358 0.8497 260 0.1017 0.9015 0.9369 0.9189 0.9710
0.1485 0.9150 280 0.1224 0.8774 0.9504 0.9124 0.9653
0.1477 0.9804 300 0.0945 0.9156 0.9391 0.9272 0.9730
0.1227 1.0458 320 0.0975 0.8964 0.9442 0.9197 0.9717
0.1076 1.1111 340 0.0860 0.9186 0.9470 0.9326 0.9748
0.1068 1.1765 360 0.0972 0.9042 0.9483 0.9257 0.9721
0.1089 1.2418 380 0.0927 0.9225 0.9283 0.9254 0.9729
0.1135 1.3072 400 0.0822 0.9263 0.9464 0.9362 0.9768
0.1121 1.3725 420 0.1034 0.8846 0.9224 0.9031 0.9678
0.1019 1.4379 440 0.0850 0.9233 0.9451 0.9341 0.9768
0.0908 1.5033 460 0.0733 0.9293 0.9525 0.9408 0.9783
0.1003 1.5686 480 0.0908 0.9119 0.9516 0.9314 0.9730
0.0922 1.6340 500 0.0872 0.9153 0.9543 0.9344 0.9758
0.0815 1.6993 520 0.0851 0.9183 0.9623 0.9398 0.9775
0.0931 1.7647 540 0.0837 0.9229 0.9455 0.9341 0.9775
0.0922 1.8301 560 0.0853 0.9149 0.9606 0.9372 0.9767
0.0953 1.8954 580 0.0899 0.9132 0.9612 0.9366 0.9753
0.0892 1.9608 600 0.0809 0.9230 0.9544 0.9384 0.9776
0.0772 2.0261 620 0.0822 0.9244 0.9580 0.9409 0.9775
0.0744 2.0915 640 0.0810 0.9225 0.9585 0.9401 0.9776
0.0558 2.1569 660 0.0758 0.9281 0.9581 0.9429 0.9795
0.0759 2.2222 680 0.0794 0.9244 0.9566 0.9402 0.9778
0.0778 2.2876 700 0.0914 0.9115 0.9527 0.9316 0.9752
0.0672 2.3529 720 0.0819 0.9285 0.9497 0.9390 0.9790
0.0708 2.4183 740 0.0823 0.9233 0.9537 0.9382 0.9760
0.0906 2.4837 760 0.0732 0.9317 0.9577 0.9445 0.9803
0.066 2.5490 780 0.0767 0.9319 0.9568 0.9442 0.9799
0.074 2.6144 800 0.0730 0.9359 0.9524 0.9441 0.9797
0.0636 2.6797 820 0.0720 0.9406 0.9536 0.9470 0.9808
0.0687 2.7451 840 0.0758 0.9294 0.9531 0.9411 0.9793
0.0611 2.8105 860 0.0777 0.9327 0.9607 0.9465 0.9803
0.0766 2.8758 880 0.0759 0.9239 0.9564 0.9399 0.9784
0.0625 2.9412 900 0.0814 0.9157 0.9593 0.9370 0.9775
0.0619 3.0065 920 0.0757 0.9332 0.9586 0.9457 0.9803
0.0482 3.0719 940 0.0733 0.9310 0.9593 0.9449 0.9812
0.0531 3.1373 960 0.0862 0.9258 0.9618 0.9435 0.9780
0.0507 3.2026 980 0.0733 0.9326 0.9561 0.9442 0.9803
0.0597 3.2680 1000 0.0758 0.9313 0.9562 0.9436 0.9793
0.0527 3.3333 1020 0.0765 0.9256 0.9533 0.9393 0.9790
0.0561 3.3987 1040 0.0854 0.9169 0.9643 0.9400 0.9775
0.0584 3.4641 1060 0.0802 0.9251 0.9612 0.9428 0.9792
0.053 3.5294 1080 0.0770 0.9259 0.9494 0.9375 0.9790
0.0483 3.5948 1100 0.0792 0.9306 0.9591 0.9446 0.9791
0.041 3.6601 1120 0.0763 0.9343 0.9509 0.9425 0.9802
0.0687 3.7255 1140 0.0719 0.9410 0.9627 0.9517 0.9819
0.051 3.7908 1160 0.0715 0.9385 0.9601 0.9492 0.9819
0.0666 3.8562 1180 0.0807 0.9251 0.9577 0.9411 0.9782
0.06 3.9216 1200 0.0838 0.9256 0.9551 0.9402 0.9778
0.0635 3.9869 1220 0.0751 0.9291 0.9610 0.9448 0.9804
0.0429 4.0523 1240 0.0755 0.9380 0.9593 0.9485 0.9816
0.0379 4.1176 1260 0.0750 0.9300 0.9589 0.9443 0.9808
0.0376 4.1830 1280 0.0735 0.9366 0.9603 0.9483 0.9818
0.0454 4.2484 1300 0.0737 0.9398 0.9549 0.9473 0.9808
0.0413 4.3137 1320 0.0729 0.9416 0.9506 0.9460 0.9807
0.0458 4.3791 1340 0.0713 0.9423 0.9585 0.9503 0.9822
0.0367 4.4444 1360 0.0759 0.9325 0.9603 0.9462 0.9812
0.0305 4.5098 1380 0.0814 0.9281 0.9611 0.9443 0.9802
0.0437 4.5752 1400 0.0842 0.9280 0.9512 0.9394 0.9790
0.0469 4.6405 1420 0.0799 0.9295 0.9592 0.9441 0.9795
0.04 4.7059 1440 0.0777 0.9359 0.9600 0.9478 0.9807
0.0462 4.7712 1460 0.0812 0.9312 0.9595 0.9452 0.9803
0.042 4.8366 1480 0.0764 0.9409 0.9542 0.9475 0.9810
0.0446 4.9020 1500 0.0767 0.9372 0.9625 0.9497 0.9812
0.0515 4.9673 1520 0.0793 0.9323 0.9552 0.9436 0.9800
0.0368 5.0327 1540 0.0802 0.9338 0.9601 0.9468 0.9806
0.0349 5.0980 1560 0.0781 0.9412 0.9599 0.9505 0.9816
0.0405 5.1634 1580 0.0773 0.9403 0.9616 0.9508 0.9822
0.0381 5.2288 1600 0.0835 0.9291 0.9577 0.9432 0.9795
0.0307 5.2941 1620 0.0772 0.9399 0.9579 0.9488 0.9815
0.0295 5.3595 1640 0.0787 0.9399 0.9595 0.9496 0.9815
0.0313 5.4248 1660 0.0787 0.9432 0.9560 0.9495 0.9821
0.0411 5.4902 1680 0.0848 0.9274 0.9512 0.9391 0.9790
0.0397 5.5556 1700 0.0784 0.9392 0.9604 0.9497 0.9813
0.0346 5.6209 1720 0.0780 0.9373 0.9570 0.9471 0.9811
0.0343 5.6863 1740 0.0746 0.9416 0.9554 0.9484 0.9815
0.0327 5.7516 1760 0.0842 0.9226 0.9512 0.9366 0.9785
0.0307 5.8170 1780 0.0783 0.9366 0.9594 0.9479 0.9815
0.0399 5.8824 1800 0.0802 0.9325 0.9577 0.9450 0.9802
0.0317 5.9477 1820 0.0767 0.9397 0.9603 0.9499 0.9818
0.033 6.0131 1840 0.0802 0.9336 0.9573 0.9453 0.9805
0.0289 6.0784 1860 0.0791 0.9370 0.9560 0.9464 0.9810
0.0275 6.1438 1880 0.0799 0.9316 0.9543 0.9428 0.9802
0.0222 6.2092 1900 0.0830 0.9296 0.9552 0.9423 0.9799
0.0329 6.2745 1920 0.0796 0.9382 0.9587 0.9483 0.9815
0.0281 6.3399 1940 0.0808 0.9349 0.9580 0.9463 0.9810
0.0281 6.4052 1960 0.0794 0.9370 0.9579 0.9473 0.9814
0.0231 6.4706 1980 0.0814 0.9332 0.9572 0.9450 0.9808
0.0251 6.5359 2000 0.0808 0.9360 0.9588 0.9473 0.9813
0.0294 6.6013 2020 0.0808 0.9368 0.9586 0.9476 0.9812
0.023 6.6667 2040 0.0825 0.9353 0.9581 0.9466 0.9807
0.0271 6.7320 2060 0.0811 0.9384 0.9597 0.9489 0.9814
0.0353 6.7974 2080 0.0803 0.9401 0.9601 0.9500 0.9818
0.0243 6.8627 2100 0.0814 0.9379 0.9581 0.9479 0.9810
0.0264 6.9281 2120 0.0821 0.9369 0.9583 0.9475 0.9809
0.0241 6.9935 2140 0.0818 0.9372 0.9585 0.9477 0.9810

Framework versions

  • PEFT 0.13.2
  • Transformers 4.46.3
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.20.3