AIYIYA/my_wr3

This model is a fine-tuned version of bert-base-chinese on an unknown dataset. It achieves the following results at the final training epoch (validation loss is measured on the evaluation set):

  • Train Loss: 1.1315
  • Validation Loss: 1.1418
  • Train Accuracy: 0.8158
  • Epoch: 14
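
The card does not document the downstream task or label set, but the reported accuracy suggests a classification head. Assuming the checkpoint carries a sequence-classification head (an assumption, not stated above), loading and inference would look roughly like this sketch:

```python
# Minimal usage sketch. ASSUMPTION: the checkpoint has a
# sequence-classification head; the card does not state the task or labels.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("AIYIYA/my_wr3")
model = TFAutoModelForSequenceClassification.from_pretrained("AIYIYA/my_wr3")

# Hypothetical Chinese input; the training data is not documented.
inputs = tokenizer("这是一条测试文本。", return_tensors="tf")
logits = model(**inputs).logits
predicted_class = int(tf.argmax(logits, axis=-1)[0])
print(predicted_class)
```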

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: Adam (beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False; no weight decay, no gradient clipping, EMA disabled, jit_compile: False) with a PolynomialDecay learning-rate schedule (initial_learning_rate: 2e-05, decay_steps: 90, end_learning_rate: 0.0, power: 1.0, cycle: False)
  • training_precision: float32
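
For readability, the same configuration can be reconstructed with plain Keras APIs. This is a sketch of the settings listed above, not code taken from the original training script:

```python
# Sketch reconstructing the optimizer config above (TensorFlow 2.12 / Keras).
import tensorflow as tf

# Linear decay from 2e-05 to 0.0 over 90 steps (power=1.0, no cycling).
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=90,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Plain Adam: no weight decay, no gradient clipping, EMA disabled.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```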

Training results

Train Loss   Validation Loss   Train Accuracy   Epoch
3.0206       2.6776            0.2895           0
2.6896       2.4286            0.7105           1
2.4102       2.1955            0.6579           2
2.1850       1.9989            0.7368           3
1.9867       1.8181            0.6842           4
1.8059       1.6320            0.7368           5
1.5830       1.5359            0.8158           6
1.5184       1.4081            0.7895           7
1.4472       1.3072            0.8421           8
1.3197       1.2605            0.8158           9
1.2258       1.2182            0.8158           10
1.2182       1.1752            0.8158           11
1.1015       1.1583            0.8158           12
1.1387       1.1463            0.8158           13
1.1315       1.1418            0.8158           14

Framework versions

  • Transformers 4.31.0
  • TensorFlow 2.12.0
  • Datasets 2.14.4
  • Tokenizers 0.13.3
