# bertimbau-base-finetuned-brazilian_court_decisions_bt16_ep15
This model is a fine-tuned version of [neuralmind/bert-base-portuguese-cased](https://huggingface.co/neuralmind/bert-base-portuguese-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.523958
- Accuracy: 0.772277
## Model description
More information needed
## Intended uses & limitations
More information needed
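Pending fuller documentation, a minimal inference sketch is given below. The Hub model id is an assumption taken from the card title (adjust it to the actual repository path), and the returned labels depend on the fine-tuning label set, which this card does not document:

```python
# Minimal inference sketch. Assumption: the checkpoint is published on the
# Hugging Face Hub under the id below; replace it with the real repo path.
from transformers import pipeline

MODEL_ID = "bertimbau-base-finetuned-brazilian_court_decisions_bt16_ep15"

def classify(texts, model_id=MODEL_ID):
    """Run the fine-tuned sequence classifier over a list of decision texts."""
    clf = pipeline("text-classification", model=model_id)
    return clf(texts)
```

The pipeline is constructed inside `classify` so the snippet can be imported without downloading any weights.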
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
### Training results
| Epoch | Training Loss | Validation Loss | Accuracy |
|:-----:|:-------------:|:---------------:|:--------:|
| 1     | No log        | 0.852318        | 0.603960 |
| 2     | No log        | 0.728222        | 0.660891 |
| 3     | 0.781100      | 0.662818        | 0.742574 |
| 4     | 0.781100      | 0.687966        | 0.742574 |
| 5     | 0.399400      | 0.727256        | 0.762376 |
| 6     | 0.399400      | 0.843507        | 0.762376 |
| 7     | 0.399400      | 0.936927        | 0.759901 |
| 8     | 0.182400      | 1.065885        | 0.769802 |
| 9     | 0.182400      | 1.154641        | 0.754950 |
| 10    | 0.082200      | 1.375061        | 0.745050 |
| 11    | 0.082200      | 1.377540        | 0.757426 |
| 12    | 0.082200      | 1.465057        | 0.759901 |
| 13    | 0.033800      | 1.497934        | 0.762376 |
| 14    | 0.033800      | 1.504722        | 0.769802 |
| 15    | 0.017900      | 1.523958        | 0.772277 |
### Framework versions
- Transformers 4.24.0
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1