
# xls-r-1b-bem-genbed-f-model

This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the GENBED - BEM dataset. It achieves the following results on the evaluation set:

- Loss: 0.1404
- Wer: 1.4866
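Note that the reported WER is above 1.0 (over 100%). This is possible because WER divides the word-level edit distance between hypothesis and reference by the reference length, and insertion errors can push the distance past that length. A minimal sketch of the metric (the helper name `wer` and the example strings are illustrative, not taken from the training code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution

    return d[-1][-1] / len(ref)

# One substitution plus two insertions against a 2-word reference -> WER 1.5
print(wer("one two", "one too three four"))  # 1.5
```

A WER above 1.0 therefore indicates that the model emits substantially more words than the references contain, in addition to its substitution and deletion errors.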

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1.5e-07
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30.0
- mixed_precision_training: Native AMP
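With a linear scheduler and warmup, the learning rate ramps from 0 to the peak over the warmup steps, then decays linearly to 0 by the final optimizer step. A minimal sketch of that shape; the total step count of ~10,950 is an assumption inferred from the results table below (about 365 optimizer steps per epoch over 30 epochs), not a logged value:

```python
def linear_lr_with_warmup(step, base_lr=1.5e-07, warmup_steps=500,
                          total_steps=10950):
    """Linear warmup to base_lr, then linear decay to 0.

    Mirrors lr_scheduler_type: linear with lr_scheduler_warmup_steps: 500.
    total_steps is an estimate inferred from the training log, not a
    recorded hyperparameter.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # ramp up from 0
    # Decay linearly from base_lr at the end of warmup to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

Note that with this peak learning rate (1.5e-07, far below the 1e-4 to 3e-4 range commonly used when fine-tuning wav2vec2 models), updates are very small, which is consistent with the slow loss movement in the table below.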

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 0.1704        | 0.5479  | 200   | 0.1738          | 1.0    |
| 0.1698        | 1.0959  | 400   | 0.1717          | 1.0    |
| 0.1748        | 1.6438  | 600   | 0.1682          | 1.0    |
| 0.1818        | 2.1918  | 800   | 0.1649          | 1.0    |
| 0.1991        | 2.7397  | 1000  | 0.1613          | 1.0    |
| 0.1558        | 3.2877  | 1200  | 0.1589          | 1.0    |
| 0.1501        | 3.8356  | 1400  | 0.1567          | 1.0    |
| 0.1842        | 4.3836  | 1600  | 0.1545          | 1.0    |
| 0.1521        | 4.9315  | 1800  | 0.1529          | 1.0    |
| 0.1454        | 5.4795  | 2000  | 0.1517          | 1.0    |
| 0.173         | 6.0274  | 2200  | 0.1505          | 1.0    |
| 0.1591        | 6.5753  | 2400  | 0.1496          | 1.0    |
| 0.1431        | 7.1233  | 2600  | 0.1489          | 1.0    |
| 0.1406        | 7.6712  | 2800  | 0.1482          | 1.0    |
| 0.1659        | 8.2192  | 3000  | 0.1475          | 1.0    |
| 0.1561        | 8.7671  | 3200  | 0.1470          | 1.0    |
| 0.1649        | 9.3151  | 3400  | 0.1464          | 1.0    |
| 0.1457        | 9.8630  | 3600  | 0.1461          | 1.0    |
| 0.1415        | 10.4110 | 3800  | 0.1457          | 1.0    |
| 0.1629        | 10.9589 | 4000  | 0.1453          | 1.0    |
| 0.1216        | 11.5068 | 4200  | 0.1450          | 1.0    |
| 0.1623        | 12.0548 | 4400  | 0.1446          | 1.0    |
| 0.153         | 12.6027 | 4600  | 0.1442          | 1.0    |
| 0.1516        | 13.1507 | 4800  | 0.1440          | 1.0    |
| 0.1477        | 13.6986 | 5000  | 0.1438          | 1.0010 |
| 0.1337        | 14.2466 | 5200  | 0.1436          | 1.0041 |
| 0.1384        | 14.7945 | 5400  | 0.1434          | 1.0082 |
| 0.1575        | 15.3425 | 5600  | 0.1430          | 1.0103 |
| 0.1367        | 15.8904 | 5800  | 0.1428          | 1.0124 |
| 0.1912        | 16.4384 | 6000  | 0.1426          | 1.0247 |
| 0.1194        | 16.9863 | 6200  | 0.1424          | 1.0289 |
| 0.1511        | 17.5342 | 6400  | 0.1422          | 1.0351 |
| 0.1459        | 18.0822 | 6600  | 0.1421          | 1.0608 |
| 0.1465        | 18.6301 | 6800  | 0.1419          | 1.0639 |
| 0.1531        | 19.1781 | 7000  | 0.1418          | 1.0856 |
| 0.1436        | 19.7260 | 7200  | 0.1416          | 1.1247 |
| 0.139         | 20.2740 | 7400  | 0.1415          | 1.1454 |
| 0.1459        | 20.8219 | 7600  | 0.1414          | 1.1495 |
| 0.1308        | 21.3699 | 7800  | 0.1413          | 1.2021 |
| 0.1474        | 21.9178 | 8000  | 0.1412          | 1.2186 |
| 0.1518        | 22.4658 | 8200  | 0.1411          | 1.2825 |
| 0.149         | 23.0137 | 8400  | 0.1410          | 1.2866 |
| 0.1245        | 23.5616 | 8600  | 0.1409          | 1.3381 |
| 0.1498        | 24.1096 | 8800  | 0.1408          | 1.3474 |
| 0.129         | 24.6575 | 9000  | 0.1408          | 1.3825 |
| 0.1702        | 25.2055 | 9200  | 0.1407          | 1.4134 |
| 0.1391        | 25.7534 | 9400  | 0.1406          | 1.4381 |
| 0.1461        | 26.3014 | 9600  | 0.1406          | 1.4670 |
| 0.1502        | 26.8493 | 9800  | 0.1405          | 1.4639 |
| 0.1408        | 27.3973 | 10000 | 0.1405          | 1.4794 |
| 0.1326        | 27.9452 | 10200 | 0.1405          | 1.4804 |
| 0.1422        | 28.4932 | 10400 | 0.1405          | 1.4814 |
| 0.1469        | 29.0411 | 10600 | 0.1404          | 1.4814 |
| 0.133         | 29.5890 | 10800 | 0.1404          | 1.4866 |

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1
