This gemma model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth)

# Evaluation
- **ViMMRC test set:** 0.8475 accuracy

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 3407
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- num_epochs: 3
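The `total_train_batch_size` above is not an independent setting: it is the effective batch size per optimizer step, derived from the per-device batch size and gradient accumulation. A minimal sketch of that arithmetic (assuming single-device training, which the card does not state):

```python
# Values from the hyperparameter list above
train_batch_size = 16            # per-device batch size
gradient_accumulation_steps = 4  # gradients summed over this many steps
num_devices = 1                  # assumption: single GPU, not stated in the card

# Effective batch size seen by each optimizer update
total_train_batch_size = train_batch_size * gradient_accumulation_steps * num_devices
print(total_train_batch_size)  # 64, matching the card
```

With more devices, the same formula scales the effective batch size accordingly.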
### Framework versions

- PEFT 0.10.0
- Transformers 4.40.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.19.1