JP-base-clean-0215

This model is a fine-tuned version of facebook/wav2vec2-base-960h on the audiofolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0988
  • Cer: 0.012
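
The checkpoint can be used for transcription through the standard Transformers CTC interface. The snippet below is a minimal sketch, assuming the repository ships a matching processor and that sample.wav (a placeholder path) is 16 kHz mono speech:

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "SiRoZaRuPa/JP-base-clean-0215"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# wav2vec2-base-960h expects 16 kHz mono input; resample on load.
speech, sr = librosa.load("sample.wav", sr=16_000)  # placeholder path

inputs = processor(speech, sampling_rate=sr, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: per-frame argmax, then collapse repeats and blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```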

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 3125
  • num_epochs: 50
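
For reference, these settings map onto transformers.TrainingArguments roughly as in the sketch below. This is a hypothetical reconstruction, not the author's training script; output_dir is a placeholder, and the Adam betas and epsilon listed above are the Transformers defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="JP-base-clean-0215",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=3125,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: the table reports one eval per epoch
)
```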

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 5.5004 | 1.0 | 625 | 7.2647 | 1.0 |
| 4.0716 | 2.0 | 1250 | 4.3871 | 1.0 |
| 3.3302 | 3.0 | 1875 | 3.1038 | 1.0 |
| 0.8423 | 4.0 | 2500 | 0.9833 | 0.998 |
| 0.5152 | 5.0 | 3125 | 0.7318 | 0.996 |
| 0.3984 | 6.0 | 3750 | 0.4784 | 0.996 |
| 0.3481 | 7.0 | 4375 | 0.3688 | 0.994 |
| 0.3149 | 8.0 | 5000 | 0.3821 | 0.994 |
| 0.2852 | 9.0 | 5625 | 0.2320 | 0.992 |
| 0.2576 | 10.0 | 6250 | 0.2887 | 0.991 |
| 0.2423 | 11.0 | 6875 | 0.2071 | 0.991 |
| 0.2278 | 12.0 | 7500 | 0.1700 | 0.989 |
| 0.2104 | 13.0 | 8125 | 0.1553 | 0.991 |
| 0.2016 | 14.0 | 8750 | 0.1500 | 0.988 |
| 0.1967 | 15.0 | 9375 | 0.1357 | 0.985 |
| 0.1838 | 16.0 | 10000 | 0.1615 | 0.988 |
| 0.172 | 17.0 | 10625 | 0.1238 | 0.986 |
| 0.1687 | 18.0 | 11250 | 0.1270 | 0.988 |
| 0.1555 | 19.0 | 11875 | 0.1221 | 0.987 |
| 0.1532 | 20.0 | 12500 | 0.1168 | 0.988 |
| 0.1414 | 21.0 | 13125 | 0.1175 | 0.988 |
| 0.1366 | 22.0 | 13750 | 0.1231 | 0.985 |
| 0.1341 | 23.0 | 14375 | 0.1004 | 0.987 |
| 0.1273 | 24.0 | 15000 | 0.1175 | 0.984 |
| 0.1199 | 25.0 | 15625 | 0.1246 | 0.984 |
| 0.1181 | 26.0 | 16250 | 0.1382 | 0.985 |
| 0.1152 | 27.0 | 16875 | 0.1064 | 0.984 |
| 0.1116 | 28.0 | 17500 | 0.1075 | 0.985 |
| 0.1097 | 29.0 | 18125 | 0.1110 | 0.986 |
| 0.1074 | 30.0 | 18750 | 0.1399 | 0.983 |
| 0.0997 | 31.0 | 19375 | 0.1385 | 0.983 |
| 0.0998 | 32.0 | 20000 | 0.1185 | 0.983 |
| 0.0973 | 33.0 | 20625 | 0.1491 | 0.982 |
| 0.0988 | 34.0 | 21250 | 0.1232 | 0.983 |
| 0.0942 | 35.0 | 21875 | 0.1205 | 0.98 |
| 0.0949 | 36.0 | 22500 | 0.1109 | 0.981 |
| 0.0947 | 37.0 | 23125 | 0.1119 | 0.982 |
| 0.0939 | 38.0 | 23750 | 0.1151 | 0.983 |
| 0.0876 | 39.0 | 24375 | 0.1001 | 0.982 |
| 0.0893 | 40.0 | 25000 | 0.0957 | 0.984 |
| 0.0897 | 41.0 | 25625 | 0.0924 | 0.982 |
| 0.0859 | 42.0 | 26250 | 0.0959 | 0.983 |
| 0.0881 | 43.0 | 26875 | 0.0996 | 0.983 |
| 0.0885 | 44.0 | 27500 | 0.0972 | 0.982 |
| 0.0871 | 45.0 | 28125 | 0.0984 | 0.983 |
| 0.0866 | 46.0 | 28750 | 0.0976 | 0.983 |
| 0.0858 | 47.0 | 29375 | 0.0982 | 0.983 |
| 0.0882 | 48.0 | 30000 | 0.0982 | 0.983 |
| 0.0848 | 49.0 | 30625 | 0.0988 | 0.983 |
| 0.0855 | 50.0 | 31250 | 0.0988 | 0.983 |
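
The character error rate quoted in the summary can be reproduced with the evaluate library; the toy example below is a generic sketch rather than the author's evaluation code:

```python
import evaluate

cer = evaluate.load("cer")

# One substituted character out of five gives CER = 0.2.
predictions = ["こんにちわ"]
references = ["こんにちは"]
print(cer.compute(predictions=predictions, references=references))
```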

Framework versions

  • Transformers 4.37.2
  • Pytorch 2.2.0
  • Datasets 2.16.1
  • Tokenizers 0.15.1