---
language:
- en
license: mit
tags:
- generated_from_trainer
datasets:
- ag_news
widget:
- text: Oil and Economy Cloud Stocks' Outlook (Reuters) Reuters - Soaring crude prices
    plus worries about the economy and the outlook for earnings are expected to hang
    over the stock market next week during the depth of the summer doldrums
- text: Prediction Unit Helps Forecast Wildfires (AP) AP - It's barely dawn when Mike
Fitzpatrick starts his shift with a blur of colorful maps, figures and endless
charts, but already he knows what the day will bring. Lightning will strike in
places he expects. Winds will pick up, moist places will dry and flames will roar
- text: Venezuelans Flood Polls, Voting Extended CARACAS, Venezuela (Reuters) - Venezuelans
voted in huge numbers on Sunday in a historic referendum on whether to recall
left-wing President Hugo Chavez and electoral authorities prolonged voting well
into the night.
pipeline_tag: text-classification
base_model: roberta-base
model-index:
- name: roberta-base_ag_news
results: []
---
# roberta-base_ag_news
This model is a fine-tuned version of [roberta-base](https://huggingface.co./roberta-base) on the ag_news dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3583
## Model description
This is [roberta-base](https://huggingface.co./roberta-base) with a sequence-classification head, fine-tuned for four-way topic classification on AG News. Given a short English news headline or snippet, it predicts one of the dataset's four topic labels: World, Sports, Business, or Sci/Tech.
## Intended uses & limitations
The model is intended for topic classification of English news text similar in style and length to the AG News headlines and lead sentences it was trained on. Its behavior on other domains, longer documents, or non-English text has not been evaluated, and this card reports only validation loss, so downstream accuracy should be checked before use.
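Assuming the fine-tuned checkpoint has been saved locally or pushed to the Hub (the `model_path` below is a placeholder, not a published id), inference can be run with the `transformers` pipeline. The label mapping shown is AG News's standard four classes:

```python
# Standard AG News label mapping (the dataset's four topic classes).
AG_NEWS_LABELS = {0: "World", 1: "Sports", 2: "Business", 3: "Sci/Tech"}


def classify(texts, model_path="./roberta-base_ag_news"):
    """Classify news snippets with the fine-tuned checkpoint.

    `model_path` is a placeholder: point it at the saved checkpoint
    directory or the model's Hub id once published.
    """
    # Imported lazily so the label mapping above is usable without
    # loading the (heavy) transformers dependency.
    from transformers import pipeline

    clf = pipeline("text-classification", model=model_path)
    return clf(texts)


# Example (requires the checkpoint to be available locally or on the Hub):
# classify("Oil and Economy Cloud Stocks' Outlook (Reuters) ...")
```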
## Training and evaluation data
The model was fine-tuned and evaluated on the [ag_news](https://huggingface.co./datasets/ag_news) dataset, a collection of news articles labeled with four topics (World, Sports, Business, Sci/Tech). The exact train/validation split used for this run is not recorded in this card.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 5
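The hyperparameters above correspond roughly to the following `TrainingArguments` sketch (argument names follow the Transformers 4.27 API; `output_dir` and the per-epoch evaluation strategy are assumptions, not recorded in this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-base_ag_news",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table
)
```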
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 0.3692 | 1.0 | 7500 | 0.4305 |
| 1.6035 | 2.0 | 15000 | 1.8071 |
| 0.6766 | 3.0 | 22500 | 0.4494 |
| 0.3733 | 4.0 | 30000 | 0.3943 |
| 0.2483 | 5.0 | 37500 | 0.3583 |
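A quick sanity check on the table: steps per epoch times the train batch size gives the number of training examples seen per epoch. AG News ships 120,000 training examples, so 60,000 per epoch would be consistent with, for example, a 50/50 train/validation split; that is an inference, not something this card states.

```python
steps_per_epoch = 7_500   # from the Step column (one epoch)
train_batch_size = 8      # from the hyperparameters above

examples_per_epoch = steps_per_epoch * train_batch_size
print(examples_per_epoch)  # 60000
```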
### Framework versions
- Transformers 4.27.3
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2