flan-t5-small-finetuned-question-generation

This model is a fine-tuned version of google/flan-t5-small for question generation; the fine-tuning dataset is not documented. It achieves the following results on the evaluation set:

  • Loss: 1.5998
  • Rouge1: 50.1718
  • Rouge2: 27.5603
  • RougeL: 46.3981
  • RougeLsum: 46.3975
  • Gen Len: 13.7948
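
These scores were presumably computed with the `evaluate` library's ROUGE implementation and scaled by 100, as is conventional for seq2seq fine-tuning runs with the Trainer. A minimal sketch of recomputing such scores; the prediction/reference strings are placeholders, not examples from the actual evaluation set:

```python
import evaluate

# Load the ROUGE metric from the Hugging Face evaluate library.
rouge = evaluate.load("rouge")

# Placeholder pair; the real evaluation data is not documented in this card.
predictions = ["what year was the eiffel tower completed?"]
references = ["in what year was the eiffel tower completed?"]

scores = rouge.compute(predictions=predictions, references=references)
# evaluate returns fractions in [0, 1]; the card reports them multiplied by 100.
print({k: round(v * 100, 4) for k, v in scores.items()})
```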

Model description

More information needed

Intended uses & limitations

More information needed
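
Given the model name, the checkpoint can presumably be used for question generation through the standard seq2seq API. Below is a minimal usage sketch; the `answer: ... context: ...` prompt format is an assumption, since the training data and input template are undocumented:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "dantedgp/flan-t5-small-finetuned-question-generation"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical prompt format: the template actually used during fine-tuning is
# not documented in this card, so adjust it to match the training setup.
context = "The Eiffel Tower was completed in 1889 and is located in Paris."
answer = "1889"
inputs = tokenizer(f"answer: {answer} context: {context}", return_tensors="pt")

output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```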

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8
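
As a rough guide, these settings map onto Seq2SeqTrainingArguments as sketched below. This is a hypothetical reconstruction, since the actual training script is not published; the Adam betas and epsilon above match the Trainer's optimizer defaults, and per-epoch evaluation is inferred from the results table that follows.

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the configuration listed above; the real
# training script is not provided with this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-small-finetuned-question-generation",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=8,
    eval_strategy="epoch",       # inferred from the per-epoch rows below
    predict_with_generate=True,  # needed to compute ROUGE and Gen Len at eval
)
```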

Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 1.819         | 1.0   | 10913 | 1.6159          | 48.8496 | 26.1270 | 45.1331 | 45.1442   | 13.8064 |
| 1.6487        | 2.0   | 21826 | 1.5947          | 48.8142 | 26.2209 | 45.1475 | 45.1482   | 13.8229 |
| 1.5546        | 3.0   | 32739 | 1.5910          | 49.6261 | 27.1655 | 45.9472 | 45.9535   | 13.9086 |
| 1.4862        | 4.0   | 43652 | 1.5887          | 49.9953 | 27.4630 | 46.2824 | 46.2841   | 13.7223 |
| 1.4327        | 5.0   | 54565 | 1.5950          | 50.1663 | 27.6038 | 46.4602 | 46.4721   | 13.7106 |
| 1.3907        | 6.0   | 65478 | 1.5910          | 49.9510 | 27.4795 | 46.2230 | 46.2218   | 13.8172 |
| 1.3598        | 7.0   | 76391 | 1.5973          | 50.1049 | 27.4804 | 46.3268 | 46.3300   | 13.7966 |
| 1.3388        | 8.0   | 87304 | 1.5998          | 50.1718 | 27.5603 | 46.3981 | 46.3975   | 13.7948 |

Framework versions

  • Transformers 4.42.4
  • PyTorch 2.3.1
  • Datasets 2.20.0
  • Tokenizers 0.19.1