# TrOCR Small (Finetuned on French)
This model is a fine-tuned version of microsoft/trocr-base-handwritten on a French-language dataset that is not otherwise documented in this card. It achieves the following results on the evaluation set (a minimal inference sketch follows these metrics):
- Loss: 0.0871
- Model Preparation Time: 0.0066
- CER (character error rate): 0.0134
- WER (word error rate): 0.0350
- Ratio: 97.4287
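For quick testing, the checkpoint can be loaded with the standard TrOCR processor/model pairing from the Transformers library. The sketch below assumes the repository (`personalizedrefrigerator/trocr-base`) ships the processor files alongside the model weights; if it does not, the processor can be loaded from `microsoft/trocr-base-handwritten` instead. The image path is a placeholder.

```python
from PIL import Image
from transformers import TrOCRProcessor, VisionEncoderDecoderModel

# Load the fine-tuned checkpoint and its processor (repo id taken from this card).
processor = TrOCRProcessor.from_pretrained("personalizedrefrigerator/trocr-base")
model = VisionEncoderDecoderModel.from_pretrained("personalizedrefrigerator/trocr-base")

# "line.png" is a placeholder for an image containing a single line of handwriting.
image = Image.open("line.png").convert("RGB")
pixel_values = processor(images=image, return_tensors="pt").pixel_values

generated_ids = model.generate(pixel_values)
text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(text)
```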
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 12000
- mixed_precision_training: Native AMP
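The training script itself is not part of this card. The snippet below is only a sketch of how the hyperparameters above map onto `Seq2SeqTrainingArguments` in the Transformers library; the output directory is a placeholder and the train/eval datasets are not specified here.

```python
from transformers import Seq2SeqTrainingArguments

# How the listed hyperparameters would map onto Seq2SeqTrainingArguments
# (output_dir is a placeholder; the actual training script is not included in the card).
args = Seq2SeqTrainingArguments(
    output_dir="trocr-base-french",
    learning_rate=5e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",         # AdamW with betas=(0.9, 0.999) and epsilon=1e-08 (defaults)
    lr_scheduler_type="linear",
    max_steps=12000,
    fp16=True,                   # native AMP mixed-precision training
)

# These arguments would then be passed to a Seq2SeqTrainer together with the
# VisionEncoderDecoderModel, the processor, and the (unspecified) train/eval datasets.
```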
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | CER | WER | Ratio |
|---|---|---|---|---|---|---|---|
| 1.359 | 0.0333 | 400 | 1.0638 | 0.0066 | 0.1715 | 0.3841 | 89.6898 |
| 1.1123 | 0.0667 | 800 | 1.2707 | 0.0066 | 0.1715 | 0.4437 | 89.7413 |
| 0.872 | 0.1 | 1200 | 0.9230 | 0.0066 | 0.1305 | 0.3576 | 91.0810 |
| 0.6831 | 0.1333 | 1600 | 0.7046 | 0.0066 | 0.1195 | 0.2848 | 92.7617 |
| 0.6659 | 0.1667 | 2000 | 0.5952 | 0.0066 | 0.0841 | 0.2450 | 94.7742 |
| 0.612 | 0.2 | 2400 | 0.5830 | 0.0066 | 0.1029 | 0.2517 | 93.2609 |
| 0.4865 | 0.2333 | 2800 | 0.5312 | 0.0066 | 0.0973 | 0.2318 | 94.4422 |
| 0.4922 | 0.2667 | 3200 | 0.5842 | 0.0066 | 0.0918 | 0.1987 | 95.5454 |
| 0.4152 | 0.3 | 3600 | 0.4215 | 0.0066 | 0.0664 | 0.1921 | 95.9747 |
| 0.3648 | 0.3333 | 4000 | 0.4264 | 0.0066 | 0.0608 | 0.1722 | 96.4262 |
| 0.3272 | 0.3667 | 4400 | 0.5209 | 0.0066 | 0.0653 | 0.1921 | 95.3742 |
| 0.3172 | 0.4 | 4800 | 0.4229 | 0.0066 | 0.0531 | 0.1788 | 95.8131 |
| 0.2672 | 0.4333 | 5200 | 0.4071 | 0.0066 | 0.0586 | 0.1921 | 96.0787 |
| 0.2747 | 0.4667 | 5600 | 0.3494 | 0.0066 | 0.0586 | 0.1656 | 96.2269 |
| 0.2576 | 0.5 | 6000 | 0.3687 | 0.0066 | 0.0642 | 0.1523 | 96.6562 |
| 0.2138 | 0.5333 | 6400 | 0.3945 | 0.0066 | 0.0564 | 0.1391 | 96.8775 |
| 0.2197 | 0.5667 | 6800 | 0.3698 | 0.0066 | 0.0420 | 0.1391 | 97.3293 |
| 0.1908 | 1.0141 | 7200 | 0.3288 | 0.0066 | 0.0420 | 0.1126 | 97.6500 |
| 0.145 | 1.0474 | 7600 | 0.2655 | 0.0066 | 0.0332 | 0.0993 | 97.6602 |
| 0.1347 | 1.0808 | 8000 | 0.2659 | 0.0066 | 0.0365 | 0.1258 | 97.2893 |
| 0.1092 | 1.1141 | 8400 | 0.2496 | 0.0066 | 0.0343 | 0.1192 | 97.6671 |
| 0.111 | 1.1474 | 8800 | 0.2205 | 0.0066 | 0.0221 | 0.0861 | 98.4693 |
| 0.1033 | 1.1807 | 9200 | 0.2226 | 0.0066 | 0.0254 | 0.0927 | 98.1761 |
| 0.0919 | 1.2141 | 9600 | 0.1787 | 0.0066 | 0.0210 | 0.0728 | 98.6440 |
| 0.074 | 1.2474 | 10000 | 0.1756 | 0.0066 | 0.0188 | 0.0464 | 99.3030 |
| 0.0833 | 1.2808 | 10400 | 0.1830 | 0.0066 | 0.0232 | 0.0728 | 98.8762 |
| 0.057 | 1.3141 | 10800 | 0.1675 | 0.0066 | 0.0133 | 0.0530 | 99.3276 |
| 0.038 | 1.3474 | 11200 | 0.1709 | 0.0066 | 0.0188 | 0.0596 | 98.9563 |
| 0.0409 | 1.3807 | 11600 | 0.1400 | 0.0066 | 0.0133 | 0.0530 | 99.2905 |
| 0.0404 | 1.4141 | 12000 | 0.1423 | 0.0066 | 0.0122 | 0.0464 | 99.2598 |
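The card does not say which tooling produced the CER/WER columns. As an approximation, both metrics can be computed with the `evaluate` library as sketched below; the prediction and reference strings are made-up placeholders, not samples from the evaluation set.

```python
import evaluate

cer = evaluate.load("cer")
wer = evaluate.load("wer")

# Placeholder strings; in practice these come from decoding the evaluation set.
predictions = ["bonjour le monde"]
references = ["bonjour tout le monde"]

print("CER:", cer.compute(predictions=predictions, references=references))
print("WER:", wer.compute(predictions=predictions, references=references))
```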
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Tokenizers 0.20.3