# layoutlmv2-sroie-finetunev1

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on an unspecified dataset (the auto-generated card recorded the dataset name as "None"). It achieves the following results on the evaluation set:
- Loss: 0.1271
- Overall Precision: 0.9794
- Overall Recall: 0.9873
- Overall F1: 0.9833
- Overall Accuracy: 0.9949

Per-entity results (values rounded; full precision is preserved in the final row of the training-results table below):

| Entity  | Precision | Recall | F1     | Support |
|---------|-----------|--------|--------|---------|
| Address | 0.9886    | 0.9949 | 0.9917 | 3907    |
| Company | 0.9749    | 0.9913 | 0.9830 | 1491    |
| Date    | 0.9953    | 0.9860 | 0.9906 | 428     |
| Total   | 0.8827    | 0.8922 | 0.8874 | 371     |
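As a consistency check, the overall micro-averaged scores can be re-derived from the per-entity precision/recall/support figures: each entity's true-positive count follows from recall × support, and its predicted count from TP / precision. A minimal sketch (the numbers are copied from this card; the variable names are illustrative):

```python
# Re-derive the micro-averaged overall metrics from the per-entity
# (precision, recall, support) statistics reported above.
per_entity = {
    "Address": (0.9885554425228891, 0.9948809828512926, 3907),
    "Company": (0.974934036939314, 0.9912810194500336, 1491),
    "Date":    (0.9952830188679245, 0.985981308411215, 428),
    "Total":   (0.8826666666666667, 0.8921832884097035, 371),
}

tp = pred = gold = 0
for precision, recall, support in per_entity.values():
    entity_tp = round(recall * support)    # recall = TP / support
    tp += entity_tp
    pred += round(entity_tp / precision)   # precision = TP / predicted
    gold += support

micro_p = tp / pred
micro_r = tp / gold
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)
print(round(micro_p, 4), round(micro_r, 4), round(micro_f1, 4))
# → 0.9794 0.9873 0.9833
```

The derived values match the reported overall precision/recall/F1, confirming the overall numbers are micro-averages over the four entity types.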
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- label_smoothing_factor: 0.02
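For illustration, the linear learning-rate decay and the label-smoothing term can be sketched in plain Python. This is a sketch under two assumptions not stated in the card: the 800 total optimizer steps shown in the results table (40 steps/epoch × 20 epochs), and no warmup; the function names are illustrative, not from the training script.

```python
import math

BASE_LR = 3e-05
TOTAL_STEPS = 800   # assumption: 40 steps/epoch x 20 epochs, per the results table
EPSILON = 0.02      # label_smoothing_factor from the hyperparameters above

def linear_lr(step: int) -> float:
    """Linear decay from BASE_LR to 0 over TOTAL_STEPS (no warmup assumed)."""
    return BASE_LR * max(0.0, 1.0 - step / TOTAL_STEPS)

def smoothed_cross_entropy(probs, true_idx, eps=EPSILON):
    """One common label-smoothing formulation: the one-hot target is mixed
    with a uniform distribution over the K classes."""
    k = len(probs)
    target = [eps / k] * k
    target[true_idx] += 1.0 - eps
    return -sum(q * math.log(p) for q, p in zip(target, probs))

print(linear_lr(0), linear_lr(400), linear_lr(800))  # → 3e-05 1.5e-05 0.0
```

With eps = 0 the smoothed loss reduces to plain cross-entropy; with eps > 0 it penalizes over-confident predictions slightly, which is the intent of the 0.02 setting above.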
### Training results

| Training Loss | Epoch | Step | Validation Loss | Address | Company | Date | Total | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|
0.2409 | 1.0 | 40 | 0.1537 | {'precision': 0.9862804878048781, 'recall': 0.9936012285641157, 'f1': 0.9899273237281654, 'number': 3907} | {'precision': 0.908923076923077, 'recall': 0.9906103286384976, 'f1': 0.94801026957638, 'number': 1491} | {'precision': 0.9414414414414415, 'recall': 0.9766355140186916, 'f1': 0.9587155963302753, 'number': 428} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 371} | 0.9620 | 0.9322 | 0.9469 | 0.9843 |
0.1402 | 2.0 | 80 | 0.1343 | {'precision': 0.9860476915271436, 'recall': 0.9948809828512926, 'f1': 0.9904446426296343, 'number': 3907} | {'precision': 0.946257197696737, 'recall': 0.9919517102615694, 'f1': 0.9685658153241651, 'number': 1491} | {'precision': 0.9813519813519813, 'recall': 0.9836448598130841, 'f1': 0.9824970828471412, 'number': 428} | {'precision': 0.6899038461538461, 'recall': 0.7735849056603774, 'f1': 0.7293519695044473, 'number': 371} | 0.9565 | 0.9802 | 0.9682 | 0.9903 |
0.1259 | 3.0 | 120 | 0.1262 | {'precision': 0.9918200408997955, 'recall': 0.9930893268492449, 'f1': 0.9924542780406702, 'number': 3907} | {'precision': 0.9800266311584553, 'recall': 0.9872568745808182, 'f1': 0.9836284664216505, 'number': 1491} | {'precision': 0.9928741092636579, 'recall': 0.9766355140186916, 'f1': 0.9846878680800941, 'number': 428} | {'precision': 0.819672131147541, 'recall': 0.8086253369272237, 'f1': 0.814111261872456, 'number': 371} | 0.9789 | 0.9795 | 0.9792 | 0.9937 |
0.1198 | 4.0 | 160 | 0.1245 | {'precision': 0.9913309535951046, 'recall': 0.9951369337087279, 'f1': 0.9932302976114447, 'number': 3907} | {'precision': 0.9774535809018567, 'recall': 0.98859825620389, 'f1': 0.9829943314438147, 'number': 1491} | {'precision': 0.997624703087886, 'recall': 0.9813084112149533, 'f1': 0.9893992932862191, 'number': 428} | {'precision': 0.7985257985257985, 'recall': 0.876010781671159, 'f1': 0.8354755784061697, 'number': 371} | 0.9759 | 0.9855 | 0.9807 | 0.9941 |
0.1168 | 5.0 | 200 | 0.1249 | {'precision': 0.9918242207460398, 'recall': 0.9936012285641157, 'f1': 0.9927119294207902, 'number': 3907} | {'precision': 0.9679319371727748, 'recall': 0.9919517102615694, 'f1': 0.9797946339847631, 'number': 1491} | {'precision': 0.990632318501171, 'recall': 0.9883177570093458, 'f1': 0.9894736842105264, 'number': 428} | {'precision': 0.8372093023255814, 'recall': 0.8733153638814016, 'f1': 0.8548812664907651, 'number': 371} | 0.9763 | 0.9856 | 0.9810 | 0.9943 |
0.1142 | 6.0 | 240 | 0.1250 | {'precision': 0.9923175416133163, 'recall': 0.991809572562068, 'f1': 0.9920634920634921, 'number': 3907} | {'precision': 0.9813581890812251, 'recall': 0.98859825620389, 'f1': 0.9849649181423321, 'number': 1491} | {'precision': 1.0, 'recall': 0.9813084112149533, 'f1': 0.9905660377358491, 'number': 428} | {'precision': 0.8802228412256268, 'recall': 0.8517520215633423, 'f1': 0.8657534246575341, 'number': 371} | 0.9837 | 0.9819 | 0.9828 | 0.9948 |
0.113 | 7.0 | 280 | 0.1244 | {'precision': 0.9908139831589691, 'recall': 0.993857179421551, 'f1': 0.9923332481472016, 'number': 3907} | {'precision': 0.9788079470198675, 'recall': 0.9912810194500336, 'f1': 0.9850049983338888, 'number': 1491} | {'precision': 1.0, 'recall': 0.985981308411215, 'f1': 0.9929411764705882, 'number': 428} | {'precision': 0.9054441260744985, 'recall': 0.8517520215633423, 'f1': 0.8777777777777778, 'number': 371} | 0.9837 | 0.9842 | 0.9839 | 0.9952 |
0.112 | 8.0 | 320 | 0.1259 | {'precision': 0.988552531162554, 'recall': 0.9946250319938572, 'f1': 0.9915794845623884, 'number': 3907} | {'precision': 0.9730617608409987, 'recall': 0.9932930918846412, 'f1': 0.9830733488217724, 'number': 1491} | {'precision': 1.0, 'recall': 0.9836448598130841, 'f1': 0.9917550058892814, 'number': 428} | {'precision': 0.8663101604278075, 'recall': 0.8733153638814016, 'f1': 0.8697986577181209, 'number': 371} | 0.9782 | 0.9863 | 0.9822 | 0.9946 |
0.1105 | 9.0 | 360 | 0.1262 | {'precision': 0.9880559085133418, 'recall': 0.9951369337087279, 'f1': 0.991583779648049, 'number': 3907} | {'precision': 0.9788219722038385, 'recall': 0.9919517102615694, 'f1': 0.9853431045969354, 'number': 1491} | {'precision': 1.0, 'recall': 0.985981308411215, 'f1': 0.9929411764705882, 'number': 428} | {'precision': 0.8895027624309392, 'recall': 0.8679245283018868, 'f1': 0.878581173260573, 'number': 371} | 0.9809 | 0.9861 | 0.9835 | 0.9950 |
0.1102 | 10.0 | 400 | 0.1258 | {'precision': 0.9905346635968278, 'recall': 0.991041719989762, 'f1': 0.9907881269191403, 'number': 3907} | {'precision': 0.9710716633793557, 'recall': 0.9906103286384976, 'f1': 0.9807436918990704, 'number': 1491} | {'precision': 0.9976415094339622, 'recall': 0.9883177570093458, 'f1': 0.9929577464788731, 'number': 428} | {'precision': 0.8605263157894737, 'recall': 0.8814016172506739, 'f1': 0.8708388814913449, 'number': 371} | 0.9783 | 0.9842 | 0.9813 | 0.9945 |
0.1091 | 11.0 | 440 | 0.1263 | {'precision': 0.990316004077472, 'recall': 0.9946250319938572, 'f1': 0.9924658408887755, 'number': 3907} | {'precision': 0.9724409448818898, 'recall': 0.993963782696177, 'f1': 0.9830845771144279, 'number': 1491} | {'precision': 1.0, 'recall': 0.985981308411215, 'f1': 0.9929411764705882, 'number': 428} | {'precision': 0.8491048593350383, 'recall': 0.894878706199461, 'f1': 0.8713910761154856, 'number': 371} | 0.9778 | 0.9879 | 0.9828 | 0.9948 |
0.1092 | 12.0 | 480 | 0.1277 | {'precision': 0.9885437881873728, 'recall': 0.993857179421551, 'f1': 0.991193363114231, 'number': 3907} | {'precision': 0.965472312703583, 'recall': 0.993963782696177, 'f1': 0.9795109054857898, 'number': 1491} | {'precision': 0.9976359338061466, 'recall': 0.985981308411215, 'f1': 0.991774383078731, 'number': 428} | {'precision': 0.8907103825136612, 'recall': 0.8787061994609164, 'f1': 0.8846675712347355, 'number': 371} | 0.9778 | 0.9864 | 0.9821 | 0.9946 |
0.1082 | 13.0 | 520 | 0.1271 | {'precision': 0.9890501655207538, 'recall': 0.9941131302789864, 'f1': 0.9915751850906306, 'number': 3907} | {'precision': 0.9794019933554817, 'recall': 0.98859825620389, 'f1': 0.9839786381842456, 'number': 1491} | {'precision': 0.9952830188679245, 'recall': 0.985981308411215, 'f1': 0.9906103286384976, 'number': 428} | {'precision': 0.8477157360406091, 'recall': 0.9002695417789758, 'f1': 0.8732026143790849, 'number': 371} | 0.9782 | 0.9866 | 0.9824 | 0.9947 |
0.1079 | 14.0 | 560 | 0.1274 | {'precision': 0.9888040712468193, 'recall': 0.9946250319938572, 'f1': 0.991706009952788, 'number': 3907} | {'precision': 0.974934036939314, 'recall': 0.9912810194500336, 'f1': 0.9830395743265714, 'number': 1491} | {'precision': 0.9952830188679245, 'recall': 0.985981308411215, 'f1': 0.9906103286384976, 'number': 428} | {'precision': 0.8691099476439791, 'recall': 0.894878706199461, 'f1': 0.8818061088977424, 'number': 371} | 0.9786 | 0.9873 | 0.9829 | 0.9948 |
0.1076 | 15.0 | 600 | 0.1268 | {'precision': 0.9887983706720977, 'recall': 0.9941131302789864, 'f1': 0.9914486279514996, 'number': 3907} | {'precision': 0.9749009247027741, 'recall': 0.9899396378269618, 'f1': 0.9823627287853578, 'number': 1491} | {'precision': 0.9976359338061466, 'recall': 0.985981308411215, 'f1': 0.991774383078731, 'number': 428} | {'precision': 0.8840970350404312, 'recall': 0.8840970350404312, 'f1': 0.8840970350404312, 'number': 371} | 0.9798 | 0.9860 | 0.9829 | 0.9948 |
0.1076 | 16.0 | 640 | 0.1268 | {'precision': 0.988552531162554, 'recall': 0.9946250319938572, 'f1': 0.9915794845623884, 'number': 3907} | {'precision': 0.97556142668428, 'recall': 0.9906103286384976, 'f1': 0.9830282861896837, 'number': 1491} | {'precision': 0.9976359338061466, 'recall': 0.985981308411215, 'f1': 0.991774383078731, 'number': 428} | {'precision': 0.8934426229508197, 'recall': 0.8814016172506739, 'f1': 0.8873812754409769, 'number': 371} | 0.9804 | 0.9863 | 0.9833 | 0.9949 |
0.1073 | 17.0 | 680 | 0.1268 | {'precision': 0.9895541401273885, 'recall': 0.9941131302789864, 'f1': 0.9918283963227783, 'number': 3907} | {'precision': 0.974917491749175, 'recall': 0.9906103286384976, 'f1': 0.9827012641383899, 'number': 1491} | {'precision': 0.9976359338061466, 'recall': 0.985981308411215, 'f1': 0.991774383078731, 'number': 428} | {'precision': 0.8921832884097035, 'recall': 0.8921832884097035, 'f1': 0.8921832884097035, 'number': 371} | 0.9808 | 0.9866 | 0.9837 | 0.9950 |
0.1071 | 18.0 | 720 | 0.1265 | {'precision': 0.9895568008150789, 'recall': 0.9943690811364219, 'f1': 0.9919571045576407, 'number': 3907} | {'precision': 0.9761904761904762, 'recall': 0.9899396378269618, 'f1': 0.983016983016983, 'number': 1491} | {'precision': 0.9952830188679245, 'recall': 0.985981308411215, 'f1': 0.9906103286384976, 'number': 428} | {'precision': 0.8873994638069705, 'recall': 0.8921832884097035, 'f1': 0.8897849462365591, 'number': 371} | 0.9806 | 0.9866 | 0.9836 | 0.9950 |
0.1072 | 19.0 | 760 | 0.1271 | {'precision': 0.9885554425228891, 'recall': 0.9948809828512926, 'f1': 0.9917081260364842, 'number': 3907} | {'precision': 0.974934036939314, 'recall': 0.9912810194500336, 'f1': 0.9830395743265714, 'number': 1491} | {'precision': 0.9952830188679245, 'recall': 0.985981308411215, 'f1': 0.9906103286384976, 'number': 428} | {'precision': 0.8756613756613757, 'recall': 0.8921832884097035, 'f1': 0.8838451268357811, 'number': 371} | 0.9789 | 0.9873 | 0.9830 | 0.9948 |
0.1072 | 20.0 | 800 | 0.1271 | {'precision': 0.9885554425228891, 'recall': 0.9948809828512926, 'f1': 0.9917081260364842, 'number': 3907} | {'precision': 0.974934036939314, 'recall': 0.9912810194500336, 'f1': 0.9830395743265714, 'number': 1491} | {'precision': 0.9952830188679245, 'recall': 0.985981308411215, 'f1': 0.9906103286384976, 'number': 428} | {'precision': 0.8826666666666667, 'recall': 0.8921832884097035, 'f1': 0.8873994638069707, 'number': 371} | 0.9794 | 0.9873 | 0.9833 | 0.9949 |
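Note that the final epoch is not the best by validation loss: epoch 7 reaches the minimum (0.1244). Selecting the best checkpoint from the per-epoch losses above can be sketched as follows (values copied from the table; in a Trainer run, `load_best_model_at_end` serves this purpose):

```python
# (epoch, validation loss) pairs copied from the training-results table above
val_losses = [
    (1, 0.1537), (2, 0.1343), (3, 0.1262), (4, 0.1245), (5, 0.1249),
    (6, 0.1250), (7, 0.1244), (8, 0.1259), (9, 0.1262), (10, 0.1258),
    (11, 0.1263), (12, 0.1277), (13, 0.1271), (14, 0.1274), (15, 0.1268),
    (16, 0.1268), (17, 0.1268), (18, 0.1265), (19, 0.1271), (20, 0.1271),
]

# Pick the epoch with the lowest validation loss.
best_epoch, best_loss = min(val_losses, key=lambda pair: pair[1])
print(best_epoch, best_loss)  # → 7 0.1244
```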
### Framework versions
- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.13.3