lmv2-g-passport-197-doc-09-13

This model is a fine-tuned version of microsoft/layoutlmv2-base-uncased on an unspecified dataset (the auto-generated card lists it as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.0438

Per-field entity metrics ("Number" is the support, i.e. how many entities of that field appear in the evaluation set):

| Field | Precision | Recall | F1 | Number |
|:---|---:|---:|---:|---:|
| Country Code | 0.9412 | 0.9697 | 0.9552 | 33 |
| Date Of Birth | 0.9714 | 1.0 | 0.9855 | 34 |
| Date Of Expiry | 1.0 | 1.0 | 1.0 | 36 |
| Date Of Issue | 1.0 | 1.0 | 1.0 | 36 |
| Given Name | 0.9444 | 1.0 | 0.9714 | 34 |
| Nationality | 0.9714 | 1.0 | 0.9855 | 34 |
| Passport No | 0.9118 | 0.9688 | 0.9394 | 32 |
| Place Of Birth | 1.0 | 0.9730 | 0.9863 | 37 |
| Place Of Issue | 1.0 | 0.9722 | 0.9859 | 36 |
| Sex | 0.9655 | 0.9333 | 0.9492 | 30 |
| Surname | 0.9259 | 1.0 | 0.9615 | 25 |
| Type | 1.0 | 1.0 | 1.0 | 27 |

Overall:

  • Precision: 0.97
  • Recall: 0.9848
  • F1: 0.9773
  • Accuracy: 0.9941
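
These per-field scores follow the output format of entity-level seqeval evaluation, the usual metric stack for Trainer-generated token-classification cards; that attribution is an assumption, since the card does not name its metric script. A minimal sketch with illustrative labels:

```python
# Illustrative only: entity-level precision/recall/F1/support in the style
# reported above. The label names here are hypothetical, not the model's real tags.
from seqeval.metrics import classification_report

y_true = [["B-SURNAME", "I-SURNAME", "O", "B-PASSPORT_NO"]]
y_pred = [["B-SURNAME", "I-SURNAME", "O", "O"]]  # one missed entity
print(classification_report(y_true, y_pred))
```

In this scheme, precision counts predicted entities that exactly match a gold span, recall counts gold entities recovered, and the support ("Number") is the gold-entity count per field.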

Model description

This checkpoint is microsoft/layoutlmv2-base-uncased fine-tuned for token classification (key information extraction) on passport-style documents. It predicts the twelve fields listed in the evaluation results above: country code, dates of birth/expiry/issue, given name, surname, nationality, passport number, places of birth/issue, sex, and document type.
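
A minimal inference sketch, hedged: the checkpoint id below omits its owner namespace, the image path is a placeholder, and the label scheme is assumed to be the usual B-/I-/O tagging. LayoutLMv2 additionally requires detectron2, and LayoutLMv2Processor runs pytesseract OCR on the image by default.

```python
# Hedged inference sketch for this LayoutLMv2 token-classification checkpoint.
import torch
from PIL import Image
from transformers import LayoutLMv2ForTokenClassification, LayoutLMv2Processor

processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained(
    "lmv2-g-passport-197-doc-09-13"  # prepend the "<owner>/" namespace as needed
)

image = Image.open("passport.png").convert("RGB")  # placeholder path
encoding = processor(image, return_tensors="pt")   # built-in OCR extracts words/boxes

with torch.no_grad():
    logits = model(**encoding).logits

predictions = logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
for token, pred in zip(tokens, predictions):
    label = model.config.id2label[pred]  # e.g. "B-PASSPORT_NO" (scheme assumed)
    if label != "O":
        print(token, label)
```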

Intended uses & limitations

Intended for key information extraction from passport document images, covering the twelve fields evaluated above. More information needed on limitations.

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 4e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: constant
  • num_epochs: 30
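
For convenience, the list above maps directly onto Hugging Face TrainingArguments. A minimal sketch; the output directory is an assumed placeholder, and everything else mirrors the reported values (the Trainer's default optimizer of that era is AdamW with exactly these betas and epsilon):

```python
# Sketch of the reported hyperparameters as transformers.TrainingArguments.
# output_dir is an assumption; all other values come from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lmv2-g-passport-197-doc-09-13",  # assumed output path
    learning_rate=4e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=30,
    adam_beta1=0.9,    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```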

Training results

The table below reports validation metrics after each epoch. Per-field cells give Precision/Recall/F1; the support for each field is constant across epochs, so it is folded into the column headers as n. The headline metrics at the top of this card correspond to epoch 21, which has the lowest validation loss (0.0438).

| Training Loss | Epoch | Step | Validation Loss | Country Code (n=33) | Date Of Birth (n=34) | Date Of Expiry (n=36) | Date Of Issue (n=36) | Given Name (n=34) | Nationality (n=34) | Passport No (n=32) | Place Of Birth (n=37) | Place Of Issue (n=36) | Sex (n=30) | Surname (n=25) | Type (n=27) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.6757 | 1.0 | 157 | 1.2569 | 0.0/0.0/0.0 | 0.0/0.0/0.0 | 0.2466/1.0/0.3956 | 0.0/0.0/0.0 | 0.0/0.0/0.0 | 0.0/0.0/0.0 | 0.0/0.0/0.0 | 0.0/0.0/0.0 | 0.0/0.0/0.0 | 0.0/0.0/0.0 | 0.0/0.0/0.0 | 0.0/0.0/0.0 | 0.2466 | 0.0914 | 0.1333 | 0.8446 |
| 0.9214 | 2.0 | 314 | 0.5683 | 0.9394/0.9394/0.9394 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.5625/0.5294/0.5455 | 0.9714/1.0/0.9855 | 0.6098/0.7812/0.6849 | 0.9394/0.8378/0.8857 | 0.8293/0.9444/0.8831 | 1.0/0.9333/0.9655 | 0.6129/0.76/0.6786 | 1.0/0.8889/0.9412 | 0.8642 | 0.8883 | 0.8761 | 0.9777 |
| 0.4452 | 3.0 | 471 | 0.3266 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.5556/0.4412/0.4918 | 0.9714/1.0/0.9855 | 0.625/0.7812/0.6944 | 1.0/0.8108/0.8955 | 0.7556/0.9444/0.8395 | 0.9655/0.9333/0.9492 | 0.5556/0.8/0.6557 | 1.0/0.7037/0.8261 | 0.8532 | 0.8706 | 0.8618 | 0.9784 |
| 0.2823 | 4.0 | 628 | 0.2215 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.75/0.8824/0.8108 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.8378/0.9118 | 0.9459/0.9722/0.9589 | 0.9333/0.9333/0.9333 | 0.75/0.96/0.8421 | 1.0/0.9630/0.9811 | 0.9286 | 0.9569 | 0.9425 | 0.9885 |
| 0.2092 | 5.0 | 785 | 0.1633 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.8889/0.9412/0.9143 | 0.9714/1.0/0.9855 | 0.8857/0.9688/0.9254 | 1.0/0.8649/0.9275 | 0.8974/0.9722/0.9333 | 1.0/0.9333/0.9655 | 0.8889/0.96/0.9231 | 1.0/1.0/1.0 | 0.9525 | 0.9670 | 0.9597 | 0.9918 |
| 0.1593 | 6.0 | 942 | 0.1331 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 0.9730/1.0/0.9863 | 0.8857/0.9118/0.8986 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9722/0.9459/0.9589 | 0.9722/0.9722/0.9722 | 1.0/0.9/0.9474 | 0.8571/0.96/0.9057 | 1.0/0.9630/0.9811 | 0.9549 | 0.9670 | 0.9609 | 0.9908 |
| 0.1288 | 7.0 | 1099 | 0.1064 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9444/1.0/0.9714 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/0.9722/0.9859 | 1.0/0.9333/0.9655 | 0.92/0.92/0.92 | 1.0/1.0/1.0 | 0.9723 | 0.9797 | 0.9760 | 0.9941 |
| 0.1035 | 8.0 | 1256 | 0.1043 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9706/0.9706/0.9706 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9231/0.9730/0.9474 | 0.75/1.0/0.8571 | 0.9032/0.9333/0.9180 | 0.6486/0.96/0.7742 | 1.0/1.0/1.0 | 0.9085 | 0.9822 | 0.9439 | 0.9856 |
| 0.0843 | 9.0 | 1413 | 0.0823 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9143/0.9412/0.9275 | 0.9714/1.0/0.9855 | 0.9394/0.9688/0.9538 | 0.9032/0.7568/0.8235 | 0.9211/0.9722/0.9459 | 0.9655/0.9333/0.9492 | 0.7059/0.96/0.8136 | 1.0/1.0/1.0 | 0.9355 | 0.9569 | 0.9460 | 0.9905 |
| 0.0733 | 10.0 | 1570 | 0.0738 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9714/1.0/0.9855 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9459/0.9459/0.9459 | 1.0/0.9444/0.9714 | 0.8485/0.9333/0.8889 | 0.8333/1.0/0.9091 | 0.9643/1.0/0.9818 | 0.9484 | 0.9797 | 0.9638 | 0.9911 |
| 0.0614 | 11.0 | 1727 | 0.0661 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9714/1.0/0.9855 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9459/0.9459/0.9459 | 1.0/0.9722/0.9859 | 0.9655/0.9333/0.9492 | 0.9231/0.96/0.9412 | 1.0/0.9630/0.9811 | 0.9673 | 0.9772 | 0.9722 | 0.9934 |
| 0.0548 | 12.0 | 1884 | 0.0637 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 0.9730/1.0/0.9863 | 0.9167/0.9706/0.9429 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9459/0.9459/0.9459 | 1.0/0.9722/0.9859 | 0.875/0.9333/0.9032 | 0.9259/1.0/0.9615 | 0.9643/1.0/0.9818 | 0.9507 | 0.9797 | 0.965 | 0.9921 |
| 0.0515 | 13.0 | 2041 | 0.0562 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9714/1.0/0.9855 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9730/0.9730/0.9730 | 1.0/1.0/1.0 | 0.9333/0.9333/0.9333 | 0.8621/1.0/0.9259 | 0.9643/1.0/0.9818 | 0.9605 | 0.9873 | 0.9737 | 0.9931 |
| 0.0431 | 14.0 | 2198 | 0.0513 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9444/1.0/0.9714 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/1.0/1.0 | 1.0/0.9333/0.9655 | 0.9231/0.96/0.9412 | 1.0/0.9630/0.9811 | 0.9724 | 0.9822 | 0.9773 | 0.9944 |
| 0.0413 | 15.0 | 2355 | 0.0582 | 0.9412/0.9697/0.9552 | 0.9706/0.9706/0.9706 | 0.9730/1.0/0.9863 | 0.9730/1.0/0.9863 | 0.9429/0.9706/0.9565 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/1.0/1.0 | 0.9655/0.9333/0.9492 | 0.8929/1.0/0.9434 | 1.0/1.0/1.0 | 0.9627 | 0.9822 | 0.9724 | 0.9934 |
| 0.035 | 16.0 | 2512 | 0.0556 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/0.9722/0.9859 | 0.8857/0.9118/0.8986 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9730/0.9730/0.9730 | 1.0/0.9722/0.9859 | 0.9333/0.9333/0.9333 | 0.8621/1.0/0.9259 | 1.0/1.0/1.0 | 0.9552 | 0.9746 | 0.9648 | 0.9915 |
| 0.0316 | 17.0 | 2669 | 0.0517 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9167/0.9706/0.9429 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/0.9722/0.9859 | 0.875/0.9333/0.9032 | 0.8929/1.0/0.9434 | 1.0/1.0/1.0 | 0.9579 | 0.9822 | 0.9699 | 0.9928 |
| 0.027 | 18.0 | 2826 | 0.0502 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 0.9730/1.0/0.9863 | 1.0/1.0/1.0 | 0.9444/1.0/0.9714 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/0.9722/0.9859 | 0.9032/0.9333/0.9180 | 0.9259/1.0/0.9615 | 1.0/1.0/1.0 | 0.9628 | 0.9848 | 0.9737 | 0.9931 |
| 0.026 | 19.0 | 2983 | 0.0481 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9189/1.0/0.9577 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/1.0/1.0 | 0.9333/0.9333/0.9333 | 0.8333/1.0/0.9091 | 1.0/1.0/1.0 | 0.9581 | 0.9873 | 0.9725 | 0.9928 |
| 0.026 | 20.0 | 3140 | 0.0652 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 0.9730/1.0/0.9863 | 1.0/1.0/1.0 | 0.9714/1.0/0.9855 | 0.9714/1.0/0.9855 | 0.8611/0.9688/0.9118 | 0.9730/0.9730/0.9730 | 0.9730/1.0/0.9863 | 0.8235/0.9333/0.8750 | 0.8333/1.0/0.9091 | 1.0/1.0/1.0 | 0.9419 | 0.9873 | 0.9641 | 0.9882 |
| 0.0311 | 21.0 | 3297 | 0.0438 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9444/1.0/0.9714 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/0.9722/0.9859 | 0.9655/0.9333/0.9492 | 0.9259/1.0/0.9615 | 1.0/1.0/1.0 | 0.97 | 0.9848 | 0.9773 | 0.9941 |
| 0.0216 | 22.0 | 3454 | 0.0454 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9706/0.9706/0.9706 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/0.9722/0.9859 | 0.9333/0.9333/0.9333 | 0.9259/1.0/0.9615 | 1.0/1.0/1.0 | 0.9699 | 0.9822 | 0.9760 | 0.9941 |
| 0.0196 | 23.0 | 3611 | 0.0510 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9714/1.0/0.9855 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.8718/0.9189/0.8947 | 1.0/0.9722/0.9859 | 0.9655/0.9333/0.9492 | 0.9259/1.0/0.9615 | 1.0/1.0/1.0 | 0.9602 | 0.9797 | 0.9698 | 0.9934 |
| 0.0176 | 24.0 | 3768 | 0.0457 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9706/0.9706/0.9706 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/1.0/1.0 | 0.9333/0.9333/0.9333 | 0.8929/1.0/0.9434 | 1.0/1.0/1.0 | 0.9676 | 0.9848 | 0.9761 | 0.9938 |
| 0.0141 | 25.0 | 3925 | 0.0516 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9714/1.0/0.9855 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9722/0.9459/0.9589 | 0.9730/1.0/0.9863 | 0.875/0.9333/0.9032 | 0.9231/0.96/0.9412 | 0.9643/1.0/0.9818 | 0.9579 | 0.9822 | 0.9699 | 0.9928 |
| 0.0129 | 26.0 | 4082 | 0.0508 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 0.9730/1.0/0.9863 | 1.0/1.0/1.0 | 0.9714/1.0/0.9855 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/1.0/1.0 | 0.875/0.9333/0.9032 | 0.9259/1.0/0.9615 | 1.0/1.0/1.0 | 0.9629 | 0.9873 | 0.9749 | 0.9934 |
| 0.0125 | 27.0 | 4239 | 0.0455 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9714/1.0/0.9855 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/0.9722/0.9859 | 1.0/0.9333/0.9655 | 0.9259/1.0/0.9615 | 0.8710/1.0/0.9310 | 0.9652 | 0.9848 | 0.9749 | 0.9934 |
| 0.0131 | 28.0 | 4396 | 0.0452 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/0.9722/0.9859 | 0.9429/0.9706/0.9565 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 1.0/0.9730/0.9863 | 1.0/0.9722/0.9859 | 1.0/0.9333/0.9655 | 0.9231/0.96/0.9412 | 1.0/1.0/1.0 | 0.9722 | 0.9772 | 0.9747 | 0.9941 |
| 0.0112 | 29.0 | 4553 | 0.0465 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.9714/1.0/0.9855 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9459/0.9459/0.9459 | 0.9722/0.9722/0.9722 | 0.9333/0.9333/0.9333 | 0.9583/0.92/0.9388 | 1.0/1.0/1.0 | 0.9649 | 0.9772 | 0.9710 | 0.9931 |
| 0.0152 | 30.0 | 4710 | 0.0510 | 0.9412/0.9697/0.9552 | 0.9714/1.0/0.9855 | 1.0/1.0/1.0 | 1.0/1.0/1.0 | 0.8857/0.9118/0.8986 | 0.9714/1.0/0.9855 | 0.9118/0.9688/0.9394 | 0.9730/0.9730/0.9730 | 1.0/0.9722/0.9859 | 1.0/0.9333/0.9655 | 0.9231/0.96/0.9412 | 1.0/1.0/1.0 | 0.9648 | 0.9746 | 0.9697 | 0.9931 |

Framework versions

  • Transformers 4.22.0.dev0
  • Pytorch 1.12.1+cu113
  • Datasets 2.2.2
  • Tokenizers 0.12.1