js-fake-bach-epochs50

This model is a fine-tuned version of gpt2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9888
  • Accuracy: 0.0005
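
As a minimal usage sketch (assuming the checkpoint is published as juancopi81/js-fake-bach-epochs50 and reuses the standard GPT-2 tokenizer), the model can be loaded for generation with the Transformers library as shown below. The token/event encoding used for the fake-Bach training data is not documented in this card, so the prompt string is only a placeholder.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id; adjust if the checkpoint lives elsewhere.
model_id = "juancopi81/js-fake-bach-epochs50"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Placeholder prompt: the actual token/event vocabulary used during
# fine-tuning is not documented in this card.
inputs = tokenizer("PIECE_START", return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    top_p=0.95,
    temperature=1.0,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```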

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent Trainer configuration is sketched after this list):

  • learning_rate: 0.0006058454513356471
  • train_batch_size: 16
  • eval_batch_size: 32
  • seed: 1
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.01
  • num_epochs: 50
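
These settings map roughly onto the Hugging Face Trainer configuration below. This is a sketch only: the original training script is not included in the card, so the output directory and the step-based evaluation interval (315 steps, inferred from the results table) are assumptions; the Adam betas/epsilon match the values listed above and are also the library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="js-fake-bach-epochs50",   # assumed; not specified in the card
    learning_rate=6.058454513356471e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=1,
    num_train_epochs=50,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # assumption: evaluations appear every 315 steps
    eval_steps=315,
)
```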

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.3512        | 1.25  | 315   | 0.8371          | 0.0003   |
| 0.8149        | 2.51  | 630   | 0.7684          | 0.0006   |
| 0.7601        | 3.76  | 945   | 0.7187          | 0.0004   |
| 0.7186        | 5.02  | 1260  | 0.6903          | 0.0002   |
| 0.679         | 6.27  | 1575  | 0.6563          | 0.0005   |
| 0.6419        | 7.53  | 1890  | 0.6292          | 0.0001   |
| 0.6073        | 8.78  | 2205  | 0.5949          | 0.0006   |
| 0.575         | 10.04 | 2520  | 0.5828          | 0.0001   |
| 0.5425        | 11.29 | 2835  | 0.5696          | 0.0003   |
| 0.5174        | 12.55 | 3150  | 0.5609          | 0.0007   |
| 0.4933        | 13.8  | 3465  | 0.5576          | 0.0004   |
| 0.4696        | 15.06 | 3780  | 0.5661          | 0.0002   |
| 0.4423        | 16.31 | 4095  | 0.5708          | 0.0007   |
| 0.4196        | 17.57 | 4410  | 0.5780          | 0.0006   |
| 0.398         | 18.82 | 4725  | 0.5820          | 0.0009   |
| 0.374         | 20.08 | 5040  | 0.6099          | 0.0003   |
| 0.3452        | 21.33 | 5355  | 0.6230          | 0.0006   |
| 0.3256        | 22.59 | 5670  | 0.6386          | 0.0005   |
| 0.3047        | 23.84 | 5985  | 0.6462          | 0.0003   |
| 0.2812        | 25.1  | 6300  | 0.6789          | 0.0003   |
| 0.2582        | 26.35 | 6615  | 0.7053          | 0.0007   |
| 0.2406        | 27.61 | 6930  | 0.7199          | 0.0006   |
| 0.2237        | 28.86 | 7245  | 0.7399          | 0.0006   |
| 0.204         | 30.12 | 7560  | 0.7729          | 0.0006   |
| 0.1873        | 31.37 | 7875  | 0.7960          | 0.0004   |
| 0.1725        | 32.63 | 8190  | 0.8231          | 0.0005   |
| 0.1609        | 33.88 | 8505  | 0.8493          | 0.0004   |
| 0.1479        | 35.14 | 8820  | 0.8707          | 0.0003   |
| 0.1361        | 36.39 | 9135  | 0.8931          | 0.0003   |
| 0.1273        | 37.65 | 9450  | 0.9095          | 0.0003   |
| 0.12          | 38.9  | 9765  | 0.9339          | 0.0005   |
| 0.1129        | 40.16 | 10080 | 0.9444          | 0.0004   |
| 0.1062        | 41.41 | 10395 | 0.9626          | 0.0006   |
| 0.1027        | 42.67 | 10710 | 0.9669          | 0.0006   |
| 0.0994        | 43.92 | 11025 | 0.9713          | 0.0005   |
| 0.0955        | 45.18 | 11340 | 0.9830          | 0.0005   |
| 0.0939        | 46.43 | 11655 | 0.9855          | 0.0005   |
| 0.0924        | 47.69 | 11970 | 0.9884          | 0.0005   |
| 0.0916        | 48.94 | 12285 | 0.9888          | 0.0005   |
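
Assuming the reported loss is the mean token-level cross-entropy in nats (the standard Transformers causal-LM setup), the final validation loss of 0.9888 corresponds to a perplexity of exp(0.9888) ≈ 2.69.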

Framework versions

  • Transformers 4.29.1
  • Pytorch 2.0.0+cu118
  • Datasets 2.12.0
  • Tokenizers 0.13.3