sarpba committed
Commit b419333 · verified · Parent(s): 9b74d68

Upload 19 files

README.md ADDED
@@ -0,0 +1,100 @@
---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-base
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-base-hu-V2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# whisper-base-hu-V2

This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0880
- Wer: 0.0960 (≈ 9.6 % word error rate; computation sketched below)
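
For readers unfamiliar with the metric, here is a minimal sketch of how WER is typically computed with the `evaluate` library; the Hungarian sentences are made-up examples, not taken from this model's evaluation set.

```python
# Minimal WER sketch; the sentences below are hypothetical examples.
import evaluate

wer_metric = evaluate.load("wer")

references = ["egy másik példa mondat"]   # ground-truth transcript (4 words)
predictions = ["egy példa mondat"]        # hypothetical model output (3 words)

# 1 word error (the deleted "másik") out of 4 reference words -> 0.25
print(wer_metric.compute(predictions=predictions, references=references))
```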

## Model description

More information needed

## Intended uses & limitations

More information needed
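
Pending a fuller description from the author, a minimal inference sketch is shown below. The repo id `sarpba/whisper-base-hu-V2` and the file `sample.wav` (16 kHz Hungarian audio) are assumptions, not stated in this card.

```python
# Minimal inference sketch; repo id and audio path are assumptions.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="sarpba/whisper-base-hu-V2",   # assumed repo id
)

result = asr(
    "sample.wav",                        # hypothetical 16 kHz audio file
    generate_kwargs={"language": "hungarian", "task": "transcribe"},
)
print(result["text"])
```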

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 7e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- total_eval_batch_size: 64
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3.0
- mixed_precision_training: Native AMP
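
As a rough illustration only (this is not the author's training script), the settings above map onto the following `Seq2SeqTrainingArguments`; a per-device batch size of 32 on 2 GPUs with 2 gradient-accumulation steps gives the reported total train batch size of 128.

```python
# Illustrative mapping of the listed hyperparameters; not the original script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-base-hu-V2",   # assumed output directory name
    learning_rate=7e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,     # 32 * 2 GPUs * 2 steps = 128 effective
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=3.0,
    fp16=True,                         # "Native AMP" mixed precision
)
```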

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 0.551 | 0.0904 | 1000 | 0.2710 | 0.2694 |
| 0.4016 | 0.1807 | 2000 | 0.2009 | 0.2061 |
| 0.3449 | 0.2711 | 3000 | 0.1707 | 0.1770 |
| 0.3147 | 0.3614 | 4000 | 0.1588 | 0.1650 |
| 0.2936 | 0.4518 | 5000 | 0.1472 | 0.1551 |
| 0.2758 | 0.5421 | 6000 | 0.1406 | 0.1479 |
| 0.2663 | 0.6325 | 7000 | 0.1322 | 0.1393 |
| 0.2613 | 0.7228 | 8000 | 0.1283 | 0.1402 |
| 0.2491 | 0.8132 | 9000 | 0.1216 | 0.1319 |
| 0.238 | 0.9035 | 10000 | 0.1192 | 0.1291 |
| 0.2287 | 0.9939 | 11000 | 0.1151 | 0.1276 |
| 0.1798 | 1.0842 | 12000 | 0.1131 | 0.1234 |
| 0.1791 | 1.1746 | 13000 | 0.1113 | 0.1186 |
| 0.1787 | 1.2649 | 14000 | 0.1085 | 0.1186 |
| 0.1771 | 1.3553 | 15000 | 0.1068 | 0.1154 |
| 0.1728 | 1.4456 | 16000 | 0.1046 | 0.1135 |
| 0.1714 | 1.5360 | 17000 | 0.1029 | 0.1152 |
| 0.1706 | 1.6263 | 18000 | 0.1007 | 0.1117 |
| 0.163 | 1.7167 | 19000 | 0.0998 | 0.1074 |
| 0.1613 | 1.8070 | 20000 | 0.0982 | 0.1075 |
| 0.1568 | 1.8974 | 21000 | 0.0967 | 0.1087 |
| 0.1525 | 1.9878 | 22000 | 0.0945 | 0.1045 |
| 0.1063 | 2.0781 | 23000 | 0.0967 | 0.1046 |
| 0.1075 | 2.1684 | 24000 | 0.0951 | 0.1030 |
| 0.1035 | 2.2588 | 25000 | 0.0936 | 0.1015 |
| 0.1056 | 2.3491 | 26000 | 0.0928 | 0.1013 |
| 0.1019 | 2.4395 | 27000 | 0.0921 | 0.1000 |
| 0.1004 | 2.5298 | 28000 | 0.0911 | 0.0986 |
| 0.0992 | 2.6202 | 29000 | 0.0904 | 0.0980 |
| 0.1011 | 2.7105 | 30000 | 0.0898 | 0.0978 |
| 0.095 | 2.8009 | 31000 | 0.0892 | 0.0975 |
| 0.0975 | 2.8913 | 32000 | 0.0885 | 0.0960 |
| 0.0963 | 2.9816 | 33000 | 0.0880 | 0.0962 |

### Framework versions

- Transformers 4.48.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0

added_tokens.json ADDED
@@ -0,0 +1,1609 @@
1
+ {
2
+ "<|0.00|>": 50364,
3
+ "<|0.02|>": 50365,
4
+ "<|0.04|>": 50366,
5
+ "<|0.06|>": 50367,
6
+ "<|0.08|>": 50368,
7
+ "<|0.10|>": 50369,
8
+ "<|0.12|>": 50370,
9
+ "<|0.14|>": 50371,
10
+ "<|0.16|>": 50372,
11
+ "<|0.18|>": 50373,
12
+ "<|0.20|>": 50374,
13
+ "<|0.22|>": 50375,
14
+ "<|0.24|>": 50376,
15
+ "<|0.26|>": 50377,
16
+ "<|0.28|>": 50378,
17
+ "<|0.30|>": 50379,
18
+ "<|0.32|>": 50380,
19
+ "<|0.34|>": 50381,
20
+ "<|0.36|>": 50382,
21
+ "<|0.38|>": 50383,
22
+ "<|0.40|>": 50384,
23
+ "<|0.42|>": 50385,
24
+ "<|0.44|>": 50386,
25
+ "<|0.46|>": 50387,
26
+ "<|0.48|>": 50388,
27
+ "<|0.50|>": 50389,
28
+ "<|0.52|>": 50390,
29
+ "<|0.54|>": 50391,
30
+ "<|0.56|>": 50392,
31
+ "<|0.58|>": 50393,
32
+ "<|0.60|>": 50394,
33
+ "<|0.62|>": 50395,
34
+ "<|0.64|>": 50396,
35
+ "<|0.66|>": 50397,
36
+ "<|0.68|>": 50398,
37
+ "<|0.70|>": 50399,
38
+ "<|0.72|>": 50400,
39
+ "<|0.74|>": 50401,
40
+ "<|0.76|>": 50402,
41
+ "<|0.78|>": 50403,
42
+ "<|0.80|>": 50404,
43
+ "<|0.82|>": 50405,
44
+ "<|0.84|>": 50406,
45
+ "<|0.86|>": 50407,
46
+ "<|0.88|>": 50408,
47
+ "<|0.90|>": 50409,
48
+ "<|0.92|>": 50410,
49
+ "<|0.94|>": 50411,
50
+ "<|0.96|>": 50412,
51
+ "<|0.98|>": 50413,
52
+ "<|1.00|>": 50414,
53
+ "<|1.02|>": 50415,
54
+ "<|1.04|>": 50416,
55
+ "<|1.06|>": 50417,
56
+ "<|1.08|>": 50418,
57
+ "<|1.10|>": 50419,
58
+ "<|1.12|>": 50420,
59
+ "<|1.14|>": 50421,
60
+ "<|1.16|>": 50422,
61
+ "<|1.18|>": 50423,
62
+ "<|1.20|>": 50424,
63
+ "<|1.22|>": 50425,
64
+ "<|1.24|>": 50426,
65
+ "<|1.26|>": 50427,
66
+ "<|1.28|>": 50428,
67
+ "<|1.30|>": 50429,
68
+ "<|1.32|>": 50430,
69
+ "<|1.34|>": 50431,
70
+ "<|1.36|>": 50432,
71
+ "<|1.38|>": 50433,
72
+ "<|1.40|>": 50434,
73
+ "<|1.42|>": 50435,
74
+ "<|1.44|>": 50436,
75
+ "<|1.46|>": 50437,
76
+ "<|1.48|>": 50438,
77
+ "<|1.50|>": 50439,
78
+ "<|1.52|>": 50440,
79
+ "<|1.54|>": 50441,
80
+ "<|1.56|>": 50442,
81
+ "<|1.58|>": 50443,
82
+ "<|1.60|>": 50444,
83
+ "<|1.62|>": 50445,
84
+ "<|1.64|>": 50446,
85
+ "<|1.66|>": 50447,
86
+ "<|1.68|>": 50448,
87
+ "<|1.70|>": 50449,
88
+ "<|1.72|>": 50450,
89
+ "<|1.74|>": 50451,
90
+ "<|1.76|>": 50452,
91
+ "<|1.78|>": 50453,
92
+ "<|1.80|>": 50454,
93
+ "<|1.82|>": 50455,
94
+ "<|1.84|>": 50456,
95
+ "<|1.86|>": 50457,
96
+ "<|1.88|>": 50458,
97
+ "<|1.90|>": 50459,
98
+ "<|1.92|>": 50460,
99
+ "<|1.94|>": 50461,
100
+ "<|1.96|>": 50462,
101
+ "<|1.98|>": 50463,
102
+ "<|10.00|>": 50864,
103
+ "<|10.02|>": 50865,
104
+ "<|10.04|>": 50866,
105
+ "<|10.06|>": 50867,
106
+ "<|10.08|>": 50868,
107
+ "<|10.10|>": 50869,
108
+ "<|10.12|>": 50870,
109
+ "<|10.14|>": 50871,
110
+ "<|10.16|>": 50872,
111
+ "<|10.18|>": 50873,
112
+ "<|10.20|>": 50874,
113
+ "<|10.22|>": 50875,
114
+ "<|10.24|>": 50876,
115
+ "<|10.26|>": 50877,
116
+ "<|10.28|>": 50878,
117
+ "<|10.30|>": 50879,
118
+ "<|10.32|>": 50880,
119
+ "<|10.34|>": 50881,
120
+ "<|10.36|>": 50882,
121
+ "<|10.38|>": 50883,
122
+ "<|10.40|>": 50884,
123
+ "<|10.42|>": 50885,
124
+ "<|10.44|>": 50886,
125
+ "<|10.46|>": 50887,
126
+ "<|10.48|>": 50888,
127
+ "<|10.50|>": 50889,
128
+ "<|10.52|>": 50890,
129
+ "<|10.54|>": 50891,
130
+ "<|10.56|>": 50892,
131
+ "<|10.58|>": 50893,
132
+ "<|10.60|>": 50894,
133
+ "<|10.62|>": 50895,
134
+ "<|10.64|>": 50896,
135
+ "<|10.66|>": 50897,
136
+ "<|10.68|>": 50898,
137
+ "<|10.70|>": 50899,
138
+ "<|10.72|>": 50900,
139
+ "<|10.74|>": 50901,
140
+ "<|10.76|>": 50902,
141
+ "<|10.78|>": 50903,
142
+ "<|10.80|>": 50904,
143
+ "<|10.82|>": 50905,
144
+ "<|10.84|>": 50906,
145
+ "<|10.86|>": 50907,
146
+ "<|10.88|>": 50908,
147
+ "<|10.90|>": 50909,
148
+ "<|10.92|>": 50910,
149
+ "<|10.94|>": 50911,
150
+ "<|10.96|>": 50912,
151
+ "<|10.98|>": 50913,
152
+ "<|11.00|>": 50914,
153
+ "<|11.02|>": 50915,
154
+ "<|11.04|>": 50916,
155
+ "<|11.06|>": 50917,
156
+ "<|11.08|>": 50918,
157
+ "<|11.10|>": 50919,
158
+ "<|11.12|>": 50920,
159
+ "<|11.14|>": 50921,
160
+ "<|11.16|>": 50922,
161
+ "<|11.18|>": 50923,
162
+ "<|11.20|>": 50924,
163
+ "<|11.22|>": 50925,
164
+ "<|11.24|>": 50926,
165
+ "<|11.26|>": 50927,
166
+ "<|11.28|>": 50928,
167
+ "<|11.30|>": 50929,
168
+ "<|11.32|>": 50930,
169
+ "<|11.34|>": 50931,
170
+ "<|11.36|>": 50932,
171
+ "<|11.38|>": 50933,
172
+ "<|11.40|>": 50934,
173
+ "<|11.42|>": 50935,
174
+ "<|11.44|>": 50936,
175
+ "<|11.46|>": 50937,
176
+ "<|11.48|>": 50938,
177
+ "<|11.50|>": 50939,
178
+ "<|11.52|>": 50940,
179
+ "<|11.54|>": 50941,
180
+ "<|11.56|>": 50942,
181
+ "<|11.58|>": 50943,
182
+ "<|11.60|>": 50944,
183
+ "<|11.62|>": 50945,
184
+ "<|11.64|>": 50946,
185
+ "<|11.66|>": 50947,
186
+ "<|11.68|>": 50948,
187
+ "<|11.70|>": 50949,
188
+ "<|11.72|>": 50950,
189
+ "<|11.74|>": 50951,
190
+ "<|11.76|>": 50952,
191
+ "<|11.78|>": 50953,
192
+ "<|11.80|>": 50954,
193
+ "<|11.82|>": 50955,
194
+ "<|11.84|>": 50956,
195
+ "<|11.86|>": 50957,
196
+ "<|11.88|>": 50958,
197
+ "<|11.90|>": 50959,
198
+ "<|11.92|>": 50960,
199
+ "<|11.94|>": 50961,
200
+ "<|11.96|>": 50962,
201
+ "<|11.98|>": 50963,
202
+ "<|12.00|>": 50964,
203
+ "<|12.02|>": 50965,
204
+ "<|12.04|>": 50966,
205
+ "<|12.06|>": 50967,
206
+ "<|12.08|>": 50968,
207
+ "<|12.10|>": 50969,
208
+ "<|12.12|>": 50970,
209
+ "<|12.14|>": 50971,
210
+ "<|12.16|>": 50972,
211
+ "<|12.18|>": 50973,
212
+ "<|12.20|>": 50974,
213
+ "<|12.22|>": 50975,
214
+ "<|12.24|>": 50976,
215
+ "<|12.26|>": 50977,
216
+ "<|12.28|>": 50978,
217
+ "<|12.30|>": 50979,
218
+ "<|12.32|>": 50980,
219
+ "<|12.34|>": 50981,
220
+ "<|12.36|>": 50982,
221
+ "<|12.38|>": 50983,
222
+ "<|12.40|>": 50984,
223
+ "<|12.42|>": 50985,
224
+ "<|12.44|>": 50986,
225
+ "<|12.46|>": 50987,
226
+ "<|12.48|>": 50988,
227
+ "<|12.50|>": 50989,
228
+ "<|12.52|>": 50990,
229
+ "<|12.54|>": 50991,
230
+ "<|12.56|>": 50992,
231
+ "<|12.58|>": 50993,
232
+ "<|12.60|>": 50994,
233
+ "<|12.62|>": 50995,
234
+ "<|12.64|>": 50996,
235
+ "<|12.66|>": 50997,
236
+ "<|12.68|>": 50998,
237
+ "<|12.70|>": 50999,
238
+ "<|12.72|>": 51000,
239
+ "<|12.74|>": 51001,
240
+ "<|12.76|>": 51002,
241
+ "<|12.78|>": 51003,
242
+ "<|12.80|>": 51004,
243
+ "<|12.82|>": 51005,
244
+ "<|12.84|>": 51006,
245
+ "<|12.86|>": 51007,
246
+ "<|12.88|>": 51008,
247
+ "<|12.90|>": 51009,
248
+ "<|12.92|>": 51010,
249
+ "<|12.94|>": 51011,
250
+ "<|12.96|>": 51012,
251
+ "<|12.98|>": 51013,
252
+ "<|13.00|>": 51014,
253
+ "<|13.02|>": 51015,
254
+ "<|13.04|>": 51016,
255
+ "<|13.06|>": 51017,
256
+ "<|13.08|>": 51018,
257
+ "<|13.10|>": 51019,
258
+ "<|13.12|>": 51020,
259
+ "<|13.14|>": 51021,
260
+ "<|13.16|>": 51022,
261
+ "<|13.18|>": 51023,
262
+ "<|13.20|>": 51024,
263
+ "<|13.22|>": 51025,
264
+ "<|13.24|>": 51026,
265
+ "<|13.26|>": 51027,
266
+ "<|13.28|>": 51028,
267
+ "<|13.30|>": 51029,
268
+ "<|13.32|>": 51030,
269
+ "<|13.34|>": 51031,
270
+ "<|13.36|>": 51032,
271
+ "<|13.38|>": 51033,
272
+ "<|13.40|>": 51034,
273
+ "<|13.42|>": 51035,
274
+ "<|13.44|>": 51036,
275
+ "<|13.46|>": 51037,
276
+ "<|13.48|>": 51038,
277
+ "<|13.50|>": 51039,
278
+ "<|13.52|>": 51040,
279
+ "<|13.54|>": 51041,
280
+ "<|13.56|>": 51042,
281
+ "<|13.58|>": 51043,
282
+ "<|13.60|>": 51044,
283
+ "<|13.62|>": 51045,
284
+ "<|13.64|>": 51046,
285
+ "<|13.66|>": 51047,
286
+ "<|13.68|>": 51048,
287
+ "<|13.70|>": 51049,
288
+ "<|13.72|>": 51050,
289
+ "<|13.74|>": 51051,
290
+ "<|13.76|>": 51052,
291
+ "<|13.78|>": 51053,
292
+ "<|13.80|>": 51054,
293
+ "<|13.82|>": 51055,
294
+ "<|13.84|>": 51056,
295
+ "<|13.86|>": 51057,
296
+ "<|13.88|>": 51058,
297
+ "<|13.90|>": 51059,
298
+ "<|13.92|>": 51060,
299
+ "<|13.94|>": 51061,
300
+ "<|13.96|>": 51062,
301
+ "<|13.98|>": 51063,
302
+ "<|14.00|>": 51064,
303
+ "<|14.02|>": 51065,
304
+ "<|14.04|>": 51066,
305
+ "<|14.06|>": 51067,
306
+ "<|14.08|>": 51068,
307
+ "<|14.10|>": 51069,
308
+ "<|14.12|>": 51070,
309
+ "<|14.14|>": 51071,
310
+ "<|14.16|>": 51072,
311
+ "<|14.18|>": 51073,
312
+ "<|14.20|>": 51074,
313
+ "<|14.22|>": 51075,
314
+ "<|14.24|>": 51076,
315
+ "<|14.26|>": 51077,
316
+ "<|14.28|>": 51078,
317
+ "<|14.30|>": 51079,
318
+ "<|14.32|>": 51080,
319
+ "<|14.34|>": 51081,
320
+ "<|14.36|>": 51082,
321
+ "<|14.38|>": 51083,
322
+ "<|14.40|>": 51084,
323
+ "<|14.42|>": 51085,
324
+ "<|14.44|>": 51086,
325
+ "<|14.46|>": 51087,
326
+ "<|14.48|>": 51088,
327
+ "<|14.50|>": 51089,
328
+ "<|14.52|>": 51090,
329
+ "<|14.54|>": 51091,
330
+ "<|14.56|>": 51092,
331
+ "<|14.58|>": 51093,
332
+ "<|14.60|>": 51094,
333
+ "<|14.62|>": 51095,
334
+ "<|14.64|>": 51096,
335
+ "<|14.66|>": 51097,
336
+ "<|14.68|>": 51098,
337
+ "<|14.70|>": 51099,
338
+ "<|14.72|>": 51100,
339
+ "<|14.74|>": 51101,
340
+ "<|14.76|>": 51102,
341
+ "<|14.78|>": 51103,
342
+ "<|14.80|>": 51104,
343
+ "<|14.82|>": 51105,
344
+ "<|14.84|>": 51106,
345
+ "<|14.86|>": 51107,
346
+ "<|14.88|>": 51108,
347
+ "<|14.90|>": 51109,
348
+ "<|14.92|>": 51110,
349
+ "<|14.94|>": 51111,
350
+ "<|14.96|>": 51112,
351
+ "<|14.98|>": 51113,
352
+ "<|15.00|>": 51114,
353
+ "<|15.02|>": 51115,
354
+ "<|15.04|>": 51116,
355
+ "<|15.06|>": 51117,
356
+ "<|15.08|>": 51118,
357
+ "<|15.10|>": 51119,
358
+ "<|15.12|>": 51120,
359
+ "<|15.14|>": 51121,
360
+ "<|15.16|>": 51122,
361
+ "<|15.18|>": 51123,
362
+ "<|15.20|>": 51124,
363
+ "<|15.22|>": 51125,
364
+ "<|15.24|>": 51126,
365
+ "<|15.26|>": 51127,
366
+ "<|15.28|>": 51128,
367
+ "<|15.30|>": 51129,
368
+ "<|15.32|>": 51130,
369
+ "<|15.34|>": 51131,
370
+ "<|15.36|>": 51132,
371
+ "<|15.38|>": 51133,
372
+ "<|15.40|>": 51134,
373
+ "<|15.42|>": 51135,
374
+ "<|15.44|>": 51136,
375
+ "<|15.46|>": 51137,
376
+ "<|15.48|>": 51138,
377
+ "<|15.50|>": 51139,
378
+ "<|15.52|>": 51140,
379
+ "<|15.54|>": 51141,
380
+ "<|15.56|>": 51142,
381
+ "<|15.58|>": 51143,
382
+ "<|15.60|>": 51144,
383
+ "<|15.62|>": 51145,
384
+ "<|15.64|>": 51146,
385
+ "<|15.66|>": 51147,
386
+ "<|15.68|>": 51148,
387
+ "<|15.70|>": 51149,
388
+ "<|15.72|>": 51150,
389
+ "<|15.74|>": 51151,
390
+ "<|15.76|>": 51152,
391
+ "<|15.78|>": 51153,
392
+ "<|15.80|>": 51154,
393
+ "<|15.82|>": 51155,
394
+ "<|15.84|>": 51156,
395
+ "<|15.86|>": 51157,
396
+ "<|15.88|>": 51158,
397
+ "<|15.90|>": 51159,
398
+ "<|15.92|>": 51160,
399
+ "<|15.94|>": 51161,
400
+ "<|15.96|>": 51162,
401
+ "<|15.98|>": 51163,
402
+ "<|16.00|>": 51164,
403
+ "<|16.02|>": 51165,
404
+ "<|16.04|>": 51166,
405
+ "<|16.06|>": 51167,
406
+ "<|16.08|>": 51168,
407
+ "<|16.10|>": 51169,
408
+ "<|16.12|>": 51170,
409
+ "<|16.14|>": 51171,
410
+ "<|16.16|>": 51172,
411
+ "<|16.18|>": 51173,
412
+ "<|16.20|>": 51174,
413
+ "<|16.22|>": 51175,
414
+ "<|16.24|>": 51176,
415
+ "<|16.26|>": 51177,
416
+ "<|16.28|>": 51178,
417
+ "<|16.30|>": 51179,
418
+ "<|16.32|>": 51180,
419
+ "<|16.34|>": 51181,
420
+ "<|16.36|>": 51182,
421
+ "<|16.38|>": 51183,
422
+ "<|16.40|>": 51184,
423
+ "<|16.42|>": 51185,
424
+ "<|16.44|>": 51186,
425
+ "<|16.46|>": 51187,
426
+ "<|16.48|>": 51188,
427
+ "<|16.50|>": 51189,
428
+ "<|16.52|>": 51190,
429
+ "<|16.54|>": 51191,
430
+ "<|16.56|>": 51192,
431
+ "<|16.58|>": 51193,
432
+ "<|16.60|>": 51194,
433
+ "<|16.62|>": 51195,
434
+ "<|16.64|>": 51196,
435
+ "<|16.66|>": 51197,
436
+ "<|16.68|>": 51198,
437
+ "<|16.70|>": 51199,
438
+ "<|16.72|>": 51200,
439
+ "<|16.74|>": 51201,
440
+ "<|16.76|>": 51202,
441
+ "<|16.78|>": 51203,
442
+ "<|16.80|>": 51204,
443
+ "<|16.82|>": 51205,
444
+ "<|16.84|>": 51206,
445
+ "<|16.86|>": 51207,
446
+ "<|16.88|>": 51208,
447
+ "<|16.90|>": 51209,
448
+ "<|16.92|>": 51210,
449
+ "<|16.94|>": 51211,
450
+ "<|16.96|>": 51212,
451
+ "<|16.98|>": 51213,
452
+ "<|17.00|>": 51214,
453
+ "<|17.02|>": 51215,
454
+ "<|17.04|>": 51216,
455
+ "<|17.06|>": 51217,
456
+ "<|17.08|>": 51218,
457
+ "<|17.10|>": 51219,
458
+ "<|17.12|>": 51220,
459
+ "<|17.14|>": 51221,
460
+ "<|17.16|>": 51222,
461
+ "<|17.18|>": 51223,
462
+ "<|17.20|>": 51224,
463
+ "<|17.22|>": 51225,
464
+ "<|17.24|>": 51226,
465
+ "<|17.26|>": 51227,
466
+ "<|17.28|>": 51228,
467
+ "<|17.30|>": 51229,
468
+ "<|17.32|>": 51230,
469
+ "<|17.34|>": 51231,
470
+ "<|17.36|>": 51232,
471
+ "<|17.38|>": 51233,
472
+ "<|17.40|>": 51234,
473
+ "<|17.42|>": 51235,
474
+ "<|17.44|>": 51236,
475
+ "<|17.46|>": 51237,
476
+ "<|17.48|>": 51238,
477
+ "<|17.50|>": 51239,
478
+ "<|17.52|>": 51240,
479
+ "<|17.54|>": 51241,
480
+ "<|17.56|>": 51242,
481
+ "<|17.58|>": 51243,
482
+ "<|17.60|>": 51244,
483
+ "<|17.62|>": 51245,
484
+ "<|17.64|>": 51246,
485
+ "<|17.66|>": 51247,
486
+ "<|17.68|>": 51248,
487
+ "<|17.70|>": 51249,
488
+ "<|17.72|>": 51250,
489
+ "<|17.74|>": 51251,
490
+ "<|17.76|>": 51252,
491
+ "<|17.78|>": 51253,
492
+ "<|17.80|>": 51254,
493
+ "<|17.82|>": 51255,
494
+ "<|17.84|>": 51256,
495
+ "<|17.86|>": 51257,
496
+ "<|17.88|>": 51258,
497
+ "<|17.90|>": 51259,
498
+ "<|17.92|>": 51260,
499
+ "<|17.94|>": 51261,
500
+ "<|17.96|>": 51262,
501
+ "<|17.98|>": 51263,
502
+ "<|18.00|>": 51264,
503
+ "<|18.02|>": 51265,
504
+ "<|18.04|>": 51266,
505
+ "<|18.06|>": 51267,
506
+ "<|18.08|>": 51268,
507
+ "<|18.10|>": 51269,
508
+ "<|18.12|>": 51270,
509
+ "<|18.14|>": 51271,
510
+ "<|18.16|>": 51272,
511
+ "<|18.18|>": 51273,
512
+ "<|18.20|>": 51274,
513
+ "<|18.22|>": 51275,
514
+ "<|18.24|>": 51276,
515
+ "<|18.26|>": 51277,
516
+ "<|18.28|>": 51278,
517
+ "<|18.30|>": 51279,
518
+ "<|18.32|>": 51280,
519
+ "<|18.34|>": 51281,
520
+ "<|18.36|>": 51282,
521
+ "<|18.38|>": 51283,
522
+ "<|18.40|>": 51284,
523
+ "<|18.42|>": 51285,
524
+ "<|18.44|>": 51286,
525
+ "<|18.46|>": 51287,
526
+ "<|18.48|>": 51288,
527
+ "<|18.50|>": 51289,
528
+ "<|18.52|>": 51290,
529
+ "<|18.54|>": 51291,
530
+ "<|18.56|>": 51292,
531
+ "<|18.58|>": 51293,
532
+ "<|18.60|>": 51294,
533
+ "<|18.62|>": 51295,
534
+ "<|18.64|>": 51296,
535
+ "<|18.66|>": 51297,
536
+ "<|18.68|>": 51298,
537
+ "<|18.70|>": 51299,
538
+ "<|18.72|>": 51300,
539
+ "<|18.74|>": 51301,
540
+ "<|18.76|>": 51302,
541
+ "<|18.78|>": 51303,
542
+ "<|18.80|>": 51304,
543
+ "<|18.82|>": 51305,
544
+ "<|18.84|>": 51306,
545
+ "<|18.86|>": 51307,
546
+ "<|18.88|>": 51308,
547
+ "<|18.90|>": 51309,
548
+ "<|18.92|>": 51310,
549
+ "<|18.94|>": 51311,
550
+ "<|18.96|>": 51312,
551
+ "<|18.98|>": 51313,
552
+ "<|19.00|>": 51314,
553
+ "<|19.02|>": 51315,
554
+ "<|19.04|>": 51316,
555
+ "<|19.06|>": 51317,
556
+ "<|19.08|>": 51318,
557
+ "<|19.10|>": 51319,
558
+ "<|19.12|>": 51320,
559
+ "<|19.14|>": 51321,
560
+ "<|19.16|>": 51322,
561
+ "<|19.18|>": 51323,
562
+ "<|19.20|>": 51324,
563
+ "<|19.22|>": 51325,
564
+ "<|19.24|>": 51326,
565
+ "<|19.26|>": 51327,
566
+ "<|19.28|>": 51328,
567
+ "<|19.30|>": 51329,
568
+ "<|19.32|>": 51330,
569
+ "<|19.34|>": 51331,
570
+ "<|19.36|>": 51332,
571
+ "<|19.38|>": 51333,
572
+ "<|19.40|>": 51334,
573
+ "<|19.42|>": 51335,
574
+ "<|19.44|>": 51336,
575
+ "<|19.46|>": 51337,
576
+ "<|19.48|>": 51338,
577
+ "<|19.50|>": 51339,
578
+ "<|19.52|>": 51340,
579
+ "<|19.54|>": 51341,
580
+ "<|19.56|>": 51342,
581
+ "<|19.58|>": 51343,
582
+ "<|19.60|>": 51344,
583
+ "<|19.62|>": 51345,
584
+ "<|19.64|>": 51346,
585
+ "<|19.66|>": 51347,
586
+ "<|19.68|>": 51348,
587
+ "<|19.70|>": 51349,
588
+ "<|19.72|>": 51350,
589
+ "<|19.74|>": 51351,
590
+ "<|19.76|>": 51352,
591
+ "<|19.78|>": 51353,
592
+ "<|19.80|>": 51354,
593
+ "<|19.82|>": 51355,
594
+ "<|19.84|>": 51356,
595
+ "<|19.86|>": 51357,
596
+ "<|19.88|>": 51358,
597
+ "<|19.90|>": 51359,
598
+ "<|19.92|>": 51360,
599
+ "<|19.94|>": 51361,
600
+ "<|19.96|>": 51362,
601
+ "<|19.98|>": 51363,
602
+ "<|2.00|>": 50464,
603
+ "<|2.02|>": 50465,
604
+ "<|2.04|>": 50466,
605
+ "<|2.06|>": 50467,
606
+ "<|2.08|>": 50468,
607
+ "<|2.10|>": 50469,
608
+ "<|2.12|>": 50470,
609
+ "<|2.14|>": 50471,
610
+ "<|2.16|>": 50472,
611
+ "<|2.18|>": 50473,
612
+ "<|2.20|>": 50474,
613
+ "<|2.22|>": 50475,
614
+ "<|2.24|>": 50476,
615
+ "<|2.26|>": 50477,
616
+ "<|2.28|>": 50478,
617
+ "<|2.30|>": 50479,
618
+ "<|2.32|>": 50480,
619
+ "<|2.34|>": 50481,
620
+ "<|2.36|>": 50482,
621
+ "<|2.38|>": 50483,
622
+ "<|2.40|>": 50484,
623
+ "<|2.42|>": 50485,
624
+ "<|2.44|>": 50486,
625
+ "<|2.46|>": 50487,
626
+ "<|2.48|>": 50488,
627
+ "<|2.50|>": 50489,
628
+ "<|2.52|>": 50490,
629
+ "<|2.54|>": 50491,
630
+ "<|2.56|>": 50492,
631
+ "<|2.58|>": 50493,
632
+ "<|2.60|>": 50494,
633
+ "<|2.62|>": 50495,
634
+ "<|2.64|>": 50496,
635
+ "<|2.66|>": 50497,
636
+ "<|2.68|>": 50498,
637
+ "<|2.70|>": 50499,
638
+ "<|2.72|>": 50500,
639
+ "<|2.74|>": 50501,
640
+ "<|2.76|>": 50502,
641
+ "<|2.78|>": 50503,
642
+ "<|2.80|>": 50504,
643
+ "<|2.82|>": 50505,
644
+ "<|2.84|>": 50506,
645
+ "<|2.86|>": 50507,
646
+ "<|2.88|>": 50508,
647
+ "<|2.90|>": 50509,
648
+ "<|2.92|>": 50510,
649
+ "<|2.94|>": 50511,
650
+ "<|2.96|>": 50512,
651
+ "<|2.98|>": 50513,
652
+ "<|20.00|>": 51364,
653
+ "<|20.02|>": 51365,
654
+ "<|20.04|>": 51366,
655
+ "<|20.06|>": 51367,
656
+ "<|20.08|>": 51368,
657
+ "<|20.10|>": 51369,
658
+ "<|20.12|>": 51370,
659
+ "<|20.14|>": 51371,
660
+ "<|20.16|>": 51372,
661
+ "<|20.18|>": 51373,
662
+ "<|20.20|>": 51374,
663
+ "<|20.22|>": 51375,
664
+ "<|20.24|>": 51376,
665
+ "<|20.26|>": 51377,
666
+ "<|20.28|>": 51378,
667
+ "<|20.30|>": 51379,
668
+ "<|20.32|>": 51380,
669
+ "<|20.34|>": 51381,
670
+ "<|20.36|>": 51382,
671
+ "<|20.38|>": 51383,
672
+ "<|20.40|>": 51384,
673
+ "<|20.42|>": 51385,
674
+ "<|20.44|>": 51386,
675
+ "<|20.46|>": 51387,
676
+ "<|20.48|>": 51388,
677
+ "<|20.50|>": 51389,
678
+ "<|20.52|>": 51390,
679
+ "<|20.54|>": 51391,
680
+ "<|20.56|>": 51392,
681
+ "<|20.58|>": 51393,
682
+ "<|20.60|>": 51394,
683
+ "<|20.62|>": 51395,
684
+ "<|20.64|>": 51396,
685
+ "<|20.66|>": 51397,
686
+ "<|20.68|>": 51398,
687
+ "<|20.70|>": 51399,
688
+ "<|20.72|>": 51400,
689
+ "<|20.74|>": 51401,
690
+ "<|20.76|>": 51402,
691
+ "<|20.78|>": 51403,
692
+ "<|20.80|>": 51404,
693
+ "<|20.82|>": 51405,
694
+ "<|20.84|>": 51406,
695
+ "<|20.86|>": 51407,
696
+ "<|20.88|>": 51408,
697
+ "<|20.90|>": 51409,
698
+ "<|20.92|>": 51410,
699
+ "<|20.94|>": 51411,
700
+ "<|20.96|>": 51412,
701
+ "<|20.98|>": 51413,
702
+ "<|21.00|>": 51414,
703
+ "<|21.02|>": 51415,
704
+ "<|21.04|>": 51416,
705
+ "<|21.06|>": 51417,
706
+ "<|21.08|>": 51418,
707
+ "<|21.10|>": 51419,
708
+ "<|21.12|>": 51420,
709
+ "<|21.14|>": 51421,
710
+ "<|21.16|>": 51422,
711
+ "<|21.18|>": 51423,
712
+ "<|21.20|>": 51424,
713
+ "<|21.22|>": 51425,
714
+ "<|21.24|>": 51426,
715
+ "<|21.26|>": 51427,
716
+ "<|21.28|>": 51428,
717
+ "<|21.30|>": 51429,
718
+ "<|21.32|>": 51430,
719
+ "<|21.34|>": 51431,
720
+ "<|21.36|>": 51432,
721
+ "<|21.38|>": 51433,
722
+ "<|21.40|>": 51434,
723
+ "<|21.42|>": 51435,
724
+ "<|21.44|>": 51436,
725
+ "<|21.46|>": 51437,
726
+ "<|21.48|>": 51438,
727
+ "<|21.50|>": 51439,
728
+ "<|21.52|>": 51440,
729
+ "<|21.54|>": 51441,
730
+ "<|21.56|>": 51442,
731
+ "<|21.58|>": 51443,
732
+ "<|21.60|>": 51444,
733
+ "<|21.62|>": 51445,
734
+ "<|21.64|>": 51446,
735
+ "<|21.66|>": 51447,
736
+ "<|21.68|>": 51448,
737
+ "<|21.70|>": 51449,
738
+ "<|21.72|>": 51450,
739
+ "<|21.74|>": 51451,
740
+ "<|21.76|>": 51452,
741
+ "<|21.78|>": 51453,
742
+ "<|21.80|>": 51454,
743
+ "<|21.82|>": 51455,
744
+ "<|21.84|>": 51456,
745
+ "<|21.86|>": 51457,
746
+ "<|21.88|>": 51458,
747
+ "<|21.90|>": 51459,
748
+ "<|21.92|>": 51460,
749
+ "<|21.94|>": 51461,
750
+ "<|21.96|>": 51462,
751
+ "<|21.98|>": 51463,
752
+ "<|22.00|>": 51464,
753
+ "<|22.02|>": 51465,
754
+ "<|22.04|>": 51466,
755
+ "<|22.06|>": 51467,
756
+ "<|22.08|>": 51468,
757
+ "<|22.10|>": 51469,
758
+ "<|22.12|>": 51470,
759
+ "<|22.14|>": 51471,
760
+ "<|22.16|>": 51472,
761
+ "<|22.18|>": 51473,
762
+ "<|22.20|>": 51474,
763
+ "<|22.22|>": 51475,
764
+ "<|22.24|>": 51476,
765
+ "<|22.26|>": 51477,
766
+ "<|22.28|>": 51478,
767
+ "<|22.30|>": 51479,
768
+ "<|22.32|>": 51480,
769
+ "<|22.34|>": 51481,
770
+ "<|22.36|>": 51482,
771
+ "<|22.38|>": 51483,
772
+ "<|22.40|>": 51484,
773
+ "<|22.42|>": 51485,
774
+ "<|22.44|>": 51486,
775
+ "<|22.46|>": 51487,
776
+ "<|22.48|>": 51488,
777
+ "<|22.50|>": 51489,
778
+ "<|22.52|>": 51490,
779
+ "<|22.54|>": 51491,
780
+ "<|22.56|>": 51492,
781
+ "<|22.58|>": 51493,
782
+ "<|22.60|>": 51494,
783
+ "<|22.62|>": 51495,
784
+ "<|22.64|>": 51496,
785
+ "<|22.66|>": 51497,
786
+ "<|22.68|>": 51498,
787
+ "<|22.70|>": 51499,
788
+ "<|22.72|>": 51500,
789
+ "<|22.74|>": 51501,
790
+ "<|22.76|>": 51502,
791
+ "<|22.78|>": 51503,
792
+ "<|22.80|>": 51504,
793
+ "<|22.82|>": 51505,
794
+ "<|22.84|>": 51506,
795
+ "<|22.86|>": 51507,
796
+ "<|22.88|>": 51508,
797
+ "<|22.90|>": 51509,
798
+ "<|22.92|>": 51510,
799
+ "<|22.94|>": 51511,
800
+ "<|22.96|>": 51512,
801
+ "<|22.98|>": 51513,
802
+ "<|23.00|>": 51514,
803
+ "<|23.02|>": 51515,
804
+ "<|23.04|>": 51516,
805
+ "<|23.06|>": 51517,
806
+ "<|23.08|>": 51518,
807
+ "<|23.10|>": 51519,
808
+ "<|23.12|>": 51520,
809
+ "<|23.14|>": 51521,
810
+ "<|23.16|>": 51522,
811
+ "<|23.18|>": 51523,
812
+ "<|23.20|>": 51524,
813
+ "<|23.22|>": 51525,
814
+ "<|23.24|>": 51526,
815
+ "<|23.26|>": 51527,
816
+ "<|23.28|>": 51528,
817
+ "<|23.30|>": 51529,
818
+ "<|23.32|>": 51530,
819
+ "<|23.34|>": 51531,
820
+ "<|23.36|>": 51532,
821
+ "<|23.38|>": 51533,
822
+ "<|23.40|>": 51534,
823
+ "<|23.42|>": 51535,
824
+ "<|23.44|>": 51536,
825
+ "<|23.46|>": 51537,
826
+ "<|23.48|>": 51538,
827
+ "<|23.50|>": 51539,
828
+ "<|23.52|>": 51540,
829
+ "<|23.54|>": 51541,
830
+ "<|23.56|>": 51542,
831
+ "<|23.58|>": 51543,
832
+ "<|23.60|>": 51544,
833
+ "<|23.62|>": 51545,
834
+ "<|23.64|>": 51546,
835
+ "<|23.66|>": 51547,
836
+ "<|23.68|>": 51548,
837
+ "<|23.70|>": 51549,
838
+ "<|23.72|>": 51550,
839
+ "<|23.74|>": 51551,
840
+ "<|23.76|>": 51552,
841
+ "<|23.78|>": 51553,
842
+ "<|23.80|>": 51554,
843
+ "<|23.82|>": 51555,
844
+ "<|23.84|>": 51556,
845
+ "<|23.86|>": 51557,
846
+ "<|23.88|>": 51558,
847
+ "<|23.90|>": 51559,
848
+ "<|23.92|>": 51560,
849
+ "<|23.94|>": 51561,
850
+ "<|23.96|>": 51562,
851
+ "<|23.98|>": 51563,
852
+ "<|24.00|>": 51564,
853
+ "<|24.02|>": 51565,
854
+ "<|24.04|>": 51566,
855
+ "<|24.06|>": 51567,
856
+ "<|24.08|>": 51568,
857
+ "<|24.10|>": 51569,
858
+ "<|24.12|>": 51570,
859
+ "<|24.14|>": 51571,
860
+ "<|24.16|>": 51572,
861
+ "<|24.18|>": 51573,
862
+ "<|24.20|>": 51574,
863
+ "<|24.22|>": 51575,
864
+ "<|24.24|>": 51576,
865
+ "<|24.26|>": 51577,
866
+ "<|24.28|>": 51578,
867
+ "<|24.30|>": 51579,
868
+ "<|24.32|>": 51580,
869
+ "<|24.34|>": 51581,
870
+ "<|24.36|>": 51582,
871
+ "<|24.38|>": 51583,
872
+ "<|24.40|>": 51584,
873
+ "<|24.42|>": 51585,
874
+ "<|24.44|>": 51586,
875
+ "<|24.46|>": 51587,
876
+ "<|24.48|>": 51588,
877
+ "<|24.50|>": 51589,
878
+ "<|24.52|>": 51590,
879
+ "<|24.54|>": 51591,
880
+ "<|24.56|>": 51592,
881
+ "<|24.58|>": 51593,
882
+ "<|24.60|>": 51594,
883
+ "<|24.62|>": 51595,
884
+ "<|24.64|>": 51596,
885
+ "<|24.66|>": 51597,
886
+ "<|24.68|>": 51598,
887
+ "<|24.70|>": 51599,
888
+ "<|24.72|>": 51600,
889
+ "<|24.74|>": 51601,
890
+ "<|24.76|>": 51602,
891
+ "<|24.78|>": 51603,
892
+ "<|24.80|>": 51604,
893
+ "<|24.82|>": 51605,
894
+ "<|24.84|>": 51606,
895
+ "<|24.86|>": 51607,
896
+ "<|24.88|>": 51608,
897
+ "<|24.90|>": 51609,
898
+ "<|24.92|>": 51610,
899
+ "<|24.94|>": 51611,
900
+ "<|24.96|>": 51612,
901
+ "<|24.98|>": 51613,
902
+ "<|25.00|>": 51614,
903
+ "<|25.02|>": 51615,
904
+ "<|25.04|>": 51616,
905
+ "<|25.06|>": 51617,
906
+ "<|25.08|>": 51618,
907
+ "<|25.10|>": 51619,
908
+ "<|25.12|>": 51620,
909
+ "<|25.14|>": 51621,
910
+ "<|25.16|>": 51622,
911
+ "<|25.18|>": 51623,
912
+ "<|25.20|>": 51624,
913
+ "<|25.22|>": 51625,
914
+ "<|25.24|>": 51626,
915
+ "<|25.26|>": 51627,
916
+ "<|25.28|>": 51628,
917
+ "<|25.30|>": 51629,
918
+ "<|25.32|>": 51630,
919
+ "<|25.34|>": 51631,
920
+ "<|25.36|>": 51632,
921
+ "<|25.38|>": 51633,
922
+ "<|25.40|>": 51634,
923
+ "<|25.42|>": 51635,
924
+ "<|25.44|>": 51636,
925
+ "<|25.46|>": 51637,
926
+ "<|25.48|>": 51638,
927
+ "<|25.50|>": 51639,
928
+ "<|25.52|>": 51640,
929
+ "<|25.54|>": 51641,
930
+ "<|25.56|>": 51642,
931
+ "<|25.58|>": 51643,
932
+ "<|25.60|>": 51644,
933
+ "<|25.62|>": 51645,
934
+ "<|25.64|>": 51646,
935
+ "<|25.66|>": 51647,
936
+ "<|25.68|>": 51648,
937
+ "<|25.70|>": 51649,
938
+ "<|25.72|>": 51650,
939
+ "<|25.74|>": 51651,
940
+ "<|25.76|>": 51652,
941
+ "<|25.78|>": 51653,
942
+ "<|25.80|>": 51654,
943
+ "<|25.82|>": 51655,
944
+ "<|25.84|>": 51656,
945
+ "<|25.86|>": 51657,
946
+ "<|25.88|>": 51658,
947
+ "<|25.90|>": 51659,
948
+ "<|25.92|>": 51660,
949
+ "<|25.94|>": 51661,
950
+ "<|25.96|>": 51662,
951
+ "<|25.98|>": 51663,
952
+ "<|26.00|>": 51664,
953
+ "<|26.02|>": 51665,
954
+ "<|26.04|>": 51666,
955
+ "<|26.06|>": 51667,
956
+ "<|26.08|>": 51668,
957
+ "<|26.10|>": 51669,
958
+ "<|26.12|>": 51670,
959
+ "<|26.14|>": 51671,
960
+ "<|26.16|>": 51672,
961
+ "<|26.18|>": 51673,
962
+ "<|26.20|>": 51674,
963
+ "<|26.22|>": 51675,
964
+ "<|26.24|>": 51676,
965
+ "<|26.26|>": 51677,
966
+ "<|26.28|>": 51678,
967
+ "<|26.30|>": 51679,
968
+ "<|26.32|>": 51680,
969
+ "<|26.34|>": 51681,
970
+ "<|26.36|>": 51682,
971
+ "<|26.38|>": 51683,
972
+ "<|26.40|>": 51684,
973
+ "<|26.42|>": 51685,
974
+ "<|26.44|>": 51686,
975
+ "<|26.46|>": 51687,
976
+ "<|26.48|>": 51688,
977
+ "<|26.50|>": 51689,
978
+ "<|26.52|>": 51690,
979
+ "<|26.54|>": 51691,
980
+ "<|26.56|>": 51692,
981
+ "<|26.58|>": 51693,
982
+ "<|26.60|>": 51694,
983
+ "<|26.62|>": 51695,
984
+ "<|26.64|>": 51696,
985
+ "<|26.66|>": 51697,
986
+ "<|26.68|>": 51698,
987
+ "<|26.70|>": 51699,
988
+ "<|26.72|>": 51700,
989
+ "<|26.74|>": 51701,
990
+ "<|26.76|>": 51702,
991
+ "<|26.78|>": 51703,
992
+ "<|26.80|>": 51704,
993
+ "<|26.82|>": 51705,
994
+ "<|26.84|>": 51706,
995
+ "<|26.86|>": 51707,
996
+ "<|26.88|>": 51708,
997
+ "<|26.90|>": 51709,
998
+ "<|26.92|>": 51710,
999
+ "<|26.94|>": 51711,
1000
+ "<|26.96|>": 51712,
1001
+ "<|26.98|>": 51713,
1002
+ "<|27.00|>": 51714,
1003
+ "<|27.02|>": 51715,
1004
+ "<|27.04|>": 51716,
1005
+ "<|27.06|>": 51717,
1006
+ "<|27.08|>": 51718,
1007
+ "<|27.10|>": 51719,
1008
+ "<|27.12|>": 51720,
1009
+ "<|27.14|>": 51721,
1010
+ "<|27.16|>": 51722,
1011
+ "<|27.18|>": 51723,
1012
+ "<|27.20|>": 51724,
1013
+ "<|27.22|>": 51725,
1014
+ "<|27.24|>": 51726,
1015
+ "<|27.26|>": 51727,
1016
+ "<|27.28|>": 51728,
1017
+ "<|27.30|>": 51729,
1018
+ "<|27.32|>": 51730,
1019
+ "<|27.34|>": 51731,
1020
+ "<|27.36|>": 51732,
1021
+ "<|27.38|>": 51733,
1022
+ "<|27.40|>": 51734,
1023
+ "<|27.42|>": 51735,
1024
+ "<|27.44|>": 51736,
1025
+ "<|27.46|>": 51737,
1026
+ "<|27.48|>": 51738,
1027
+ "<|27.50|>": 51739,
1028
+ "<|27.52|>": 51740,
1029
+ "<|27.54|>": 51741,
1030
+ "<|27.56|>": 51742,
1031
+ "<|27.58|>": 51743,
1032
+ "<|27.60|>": 51744,
1033
+ "<|27.62|>": 51745,
1034
+ "<|27.64|>": 51746,
1035
+ "<|27.66|>": 51747,
1036
+ "<|27.68|>": 51748,
1037
+ "<|27.70|>": 51749,
1038
+ "<|27.72|>": 51750,
1039
+ "<|27.74|>": 51751,
1040
+ "<|27.76|>": 51752,
1041
+ "<|27.78|>": 51753,
1042
+ "<|27.80|>": 51754,
1043
+ "<|27.82|>": 51755,
1044
+ "<|27.84|>": 51756,
1045
+ "<|27.86|>": 51757,
1046
+ "<|27.88|>": 51758,
1047
+ "<|27.90|>": 51759,
1048
+ "<|27.92|>": 51760,
1049
+ "<|27.94|>": 51761,
1050
+ "<|27.96|>": 51762,
1051
+ "<|27.98|>": 51763,
1052
+ "<|28.00|>": 51764,
1053
+ "<|28.02|>": 51765,
1054
+ "<|28.04|>": 51766,
1055
+ "<|28.06|>": 51767,
1056
+ "<|28.08|>": 51768,
1057
+ "<|28.10|>": 51769,
1058
+ "<|28.12|>": 51770,
1059
+ "<|28.14|>": 51771,
1060
+ "<|28.16|>": 51772,
1061
+ "<|28.18|>": 51773,
1062
+ "<|28.20|>": 51774,
1063
+ "<|28.22|>": 51775,
1064
+ "<|28.24|>": 51776,
1065
+ "<|28.26|>": 51777,
1066
+ "<|28.28|>": 51778,
1067
+ "<|28.30|>": 51779,
1068
+ "<|28.32|>": 51780,
1069
+ "<|28.34|>": 51781,
1070
+ "<|28.36|>": 51782,
1071
+ "<|28.38|>": 51783,
1072
+ "<|28.40|>": 51784,
1073
+ "<|28.42|>": 51785,
1074
+ "<|28.44|>": 51786,
1075
+ "<|28.46|>": 51787,
1076
+ "<|28.48|>": 51788,
1077
+ "<|28.50|>": 51789,
1078
+ "<|28.52|>": 51790,
1079
+ "<|28.54|>": 51791,
1080
+ "<|28.56|>": 51792,
1081
+ "<|28.58|>": 51793,
1082
+ "<|28.60|>": 51794,
1083
+ "<|28.62|>": 51795,
1084
+ "<|28.64|>": 51796,
1085
+ "<|28.66|>": 51797,
1086
+ "<|28.68|>": 51798,
1087
+ "<|28.70|>": 51799,
1088
+ "<|28.72|>": 51800,
1089
+ "<|28.74|>": 51801,
1090
+ "<|28.76|>": 51802,
1091
+ "<|28.78|>": 51803,
1092
+ "<|28.80|>": 51804,
1093
+ "<|28.82|>": 51805,
1094
+ "<|28.84|>": 51806,
1095
+ "<|28.86|>": 51807,
1096
+ "<|28.88|>": 51808,
1097
+ "<|28.90|>": 51809,
1098
+ "<|28.92|>": 51810,
1099
+ "<|28.94|>": 51811,
1100
+ "<|28.96|>": 51812,
1101
+ "<|28.98|>": 51813,
1102
+ "<|29.00|>": 51814,
1103
+ "<|29.02|>": 51815,
1104
+ "<|29.04|>": 51816,
1105
+ "<|29.06|>": 51817,
1106
+ "<|29.08|>": 51818,
1107
+ "<|29.10|>": 51819,
1108
+ "<|29.12|>": 51820,
1109
+ "<|29.14|>": 51821,
1110
+ "<|29.16|>": 51822,
1111
+ "<|29.18|>": 51823,
1112
+ "<|29.20|>": 51824,
1113
+ "<|29.22|>": 51825,
1114
+ "<|29.24|>": 51826,
1115
+ "<|29.26|>": 51827,
1116
+ "<|29.28|>": 51828,
1117
+ "<|29.30|>": 51829,
1118
+ "<|29.32|>": 51830,
1119
+ "<|29.34|>": 51831,
1120
+ "<|29.36|>": 51832,
1121
+ "<|29.38|>": 51833,
1122
+ "<|29.40|>": 51834,
1123
+ "<|29.42|>": 51835,
1124
+ "<|29.44|>": 51836,
1125
+ "<|29.46|>": 51837,
1126
+ "<|29.48|>": 51838,
1127
+ "<|29.50|>": 51839,
1128
+ "<|29.52|>": 51840,
1129
+ "<|29.54|>": 51841,
1130
+ "<|29.56|>": 51842,
1131
+ "<|29.58|>": 51843,
1132
+ "<|29.60|>": 51844,
1133
+ "<|29.62|>": 51845,
1134
+ "<|29.64|>": 51846,
1135
+ "<|29.66|>": 51847,
1136
+ "<|29.68|>": 51848,
1137
+ "<|29.70|>": 51849,
1138
+ "<|29.72|>": 51850,
1139
+ "<|29.74|>": 51851,
1140
+ "<|29.76|>": 51852,
1141
+ "<|29.78|>": 51853,
1142
+ "<|29.80|>": 51854,
1143
+ "<|29.82|>": 51855,
1144
+ "<|29.84|>": 51856,
1145
+ "<|29.86|>": 51857,
1146
+ "<|29.88|>": 51858,
1147
+ "<|29.90|>": 51859,
1148
+ "<|29.92|>": 51860,
1149
+ "<|29.94|>": 51861,
1150
+ "<|29.96|>": 51862,
1151
+ "<|29.98|>": 51863,
1152
+ "<|3.00|>": 50514,
1153
+ "<|3.02|>": 50515,
1154
+ "<|3.04|>": 50516,
1155
+ "<|3.06|>": 50517,
1156
+ "<|3.08|>": 50518,
1157
+ "<|3.10|>": 50519,
1158
+ "<|3.12|>": 50520,
1159
+ "<|3.14|>": 50521,
1160
+ "<|3.16|>": 50522,
1161
+ "<|3.18|>": 50523,
1162
+ "<|3.20|>": 50524,
1163
+ "<|3.22|>": 50525,
1164
+ "<|3.24|>": 50526,
1165
+ "<|3.26|>": 50527,
1166
+ "<|3.28|>": 50528,
1167
+ "<|3.30|>": 50529,
1168
+ "<|3.32|>": 50530,
1169
+ "<|3.34|>": 50531,
1170
+ "<|3.36|>": 50532,
1171
+ "<|3.38|>": 50533,
1172
+ "<|3.40|>": 50534,
1173
+ "<|3.42|>": 50535,
1174
+ "<|3.44|>": 50536,
1175
+ "<|3.46|>": 50537,
1176
+ "<|3.48|>": 50538,
1177
+ "<|3.50|>": 50539,
1178
+ "<|3.52|>": 50540,
1179
+ "<|3.54|>": 50541,
1180
+ "<|3.56|>": 50542,
1181
+ "<|3.58|>": 50543,
1182
+ "<|3.60|>": 50544,
1183
+ "<|3.62|>": 50545,
1184
+ "<|3.64|>": 50546,
1185
+ "<|3.66|>": 50547,
1186
+ "<|3.68|>": 50548,
1187
+ "<|3.70|>": 50549,
1188
+ "<|3.72|>": 50550,
1189
+ "<|3.74|>": 50551,
1190
+ "<|3.76|>": 50552,
1191
+ "<|3.78|>": 50553,
1192
+ "<|3.80|>": 50554,
1193
+ "<|3.82|>": 50555,
1194
+ "<|3.84|>": 50556,
1195
+ "<|3.86|>": 50557,
1196
+ "<|3.88|>": 50558,
1197
+ "<|3.90|>": 50559,
1198
+ "<|3.92|>": 50560,
1199
+ "<|3.94|>": 50561,
1200
+ "<|3.96|>": 50562,
1201
+ "<|3.98|>": 50563,
1202
+ "<|30.00|>": 51864,
1203
+ "<|4.00|>": 50564,
1204
+ "<|4.02|>": 50565,
1205
+ "<|4.04|>": 50566,
1206
+ "<|4.06|>": 50567,
1207
+ "<|4.08|>": 50568,
1208
+ "<|4.10|>": 50569,
1209
+ "<|4.12|>": 50570,
1210
+ "<|4.14|>": 50571,
1211
+ "<|4.16|>": 50572,
1212
+ "<|4.18|>": 50573,
1213
+ "<|4.20|>": 50574,
1214
+ "<|4.22|>": 50575,
1215
+ "<|4.24|>": 50576,
1216
+ "<|4.26|>": 50577,
1217
+ "<|4.28|>": 50578,
1218
+ "<|4.30|>": 50579,
1219
+ "<|4.32|>": 50580,
1220
+ "<|4.34|>": 50581,
1221
+ "<|4.36|>": 50582,
1222
+ "<|4.38|>": 50583,
1223
+ "<|4.40|>": 50584,
1224
+ "<|4.42|>": 50585,
1225
+ "<|4.44|>": 50586,
1226
+ "<|4.46|>": 50587,
1227
+ "<|4.48|>": 50588,
1228
+ "<|4.50|>": 50589,
1229
+ "<|4.52|>": 50590,
1230
+ "<|4.54|>": 50591,
1231
+ "<|4.56|>": 50592,
1232
+ "<|4.58|>": 50593,
1233
+ "<|4.60|>": 50594,
1234
+ "<|4.62|>": 50595,
1235
+ "<|4.64|>": 50596,
1236
+ "<|4.66|>": 50597,
1237
+ "<|4.68|>": 50598,
1238
+ "<|4.70|>": 50599,
1239
+ "<|4.72|>": 50600,
1240
+ "<|4.74|>": 50601,
1241
+ "<|4.76|>": 50602,
1242
+ "<|4.78|>": 50603,
1243
+ "<|4.80|>": 50604,
1244
+ "<|4.82|>": 50605,
1245
+ "<|4.84|>": 50606,
1246
+ "<|4.86|>": 50607,
1247
+ "<|4.88|>": 50608,
1248
+ "<|4.90|>": 50609,
1249
+ "<|4.92|>": 50610,
1250
+ "<|4.94|>": 50611,
1251
+ "<|4.96|>": 50612,
1252
+ "<|4.98|>": 50613,
1253
+ "<|5.00|>": 50614,
1254
+ "<|5.02|>": 50615,
1255
+ "<|5.04|>": 50616,
1256
+ "<|5.06|>": 50617,
1257
+ "<|5.08|>": 50618,
1258
+ "<|5.10|>": 50619,
1259
+ "<|5.12|>": 50620,
1260
+ "<|5.14|>": 50621,
1261
+ "<|5.16|>": 50622,
1262
+ "<|5.18|>": 50623,
1263
+ "<|5.20|>": 50624,
1264
+ "<|5.22|>": 50625,
1265
+ "<|5.24|>": 50626,
1266
+ "<|5.26|>": 50627,
1267
+ "<|5.28|>": 50628,
1268
+ "<|5.30|>": 50629,
1269
+ "<|5.32|>": 50630,
1270
+ "<|5.34|>": 50631,
1271
+ "<|5.36|>": 50632,
1272
+ "<|5.38|>": 50633,
1273
+ "<|5.40|>": 50634,
1274
+ "<|5.42|>": 50635,
1275
+ "<|5.44|>": 50636,
1276
+ "<|5.46|>": 50637,
1277
+ "<|5.48|>": 50638,
1278
+ "<|5.50|>": 50639,
1279
+ "<|5.52|>": 50640,
1280
+ "<|5.54|>": 50641,
1281
+ "<|5.56|>": 50642,
1282
+ "<|5.58|>": 50643,
1283
+ "<|5.60|>": 50644,
1284
+ "<|5.62|>": 50645,
1285
+ "<|5.64|>": 50646,
1286
+ "<|5.66|>": 50647,
1287
+ "<|5.68|>": 50648,
1288
+ "<|5.70|>": 50649,
1289
+ "<|5.72|>": 50650,
1290
+ "<|5.74|>": 50651,
1291
+ "<|5.76|>": 50652,
1292
+ "<|5.78|>": 50653,
1293
+ "<|5.80|>": 50654,
1294
+ "<|5.82|>": 50655,
1295
+ "<|5.84|>": 50656,
1296
+ "<|5.86|>": 50657,
1297
+ "<|5.88|>": 50658,
1298
+ "<|5.90|>": 50659,
1299
+ "<|5.92|>": 50660,
1300
+ "<|5.94|>": 50661,
1301
+ "<|5.96|>": 50662,
1302
+ "<|5.98|>": 50663,
1303
+ "<|6.00|>": 50664,
1304
+ "<|6.02|>": 50665,
1305
+ "<|6.04|>": 50666,
1306
+ "<|6.06|>": 50667,
1307
+ "<|6.08|>": 50668,
1308
+ "<|6.10|>": 50669,
1309
+ "<|6.12|>": 50670,
1310
+ "<|6.14|>": 50671,
1311
+ "<|6.16|>": 50672,
1312
+ "<|6.18|>": 50673,
1313
+ "<|6.20|>": 50674,
1314
+ "<|6.22|>": 50675,
1315
+ "<|6.24|>": 50676,
1316
+ "<|6.26|>": 50677,
1317
+ "<|6.28|>": 50678,
1318
+ "<|6.30|>": 50679,
1319
+ "<|6.32|>": 50680,
1320
+ "<|6.34|>": 50681,
1321
+ "<|6.36|>": 50682,
1322
+ "<|6.38|>": 50683,
1323
+ "<|6.40|>": 50684,
1324
+ "<|6.42|>": 50685,
1325
+ "<|6.44|>": 50686,
1326
+ "<|6.46|>": 50687,
1327
+ "<|6.48|>": 50688,
1328
+ "<|6.50|>": 50689,
1329
+ "<|6.52|>": 50690,
1330
+ "<|6.54|>": 50691,
1331
+ "<|6.56|>": 50692,
1332
+ "<|6.58|>": 50693,
1333
+ "<|6.60|>": 50694,
1334
+ "<|6.62|>": 50695,
1335
+ "<|6.64|>": 50696,
1336
+ "<|6.66|>": 50697,
1337
+ "<|6.68|>": 50698,
1338
+ "<|6.70|>": 50699,
1339
+ "<|6.72|>": 50700,
1340
+ "<|6.74|>": 50701,
1341
+ "<|6.76|>": 50702,
1342
+ "<|6.78|>": 50703,
1343
+ "<|6.80|>": 50704,
1344
+ "<|6.82|>": 50705,
1345
+ "<|6.84|>": 50706,
1346
+ "<|6.86|>": 50707,
1347
+ "<|6.88|>": 50708,
1348
+ "<|6.90|>": 50709,
1349
+ "<|6.92|>": 50710,
1350
+ "<|6.94|>": 50711,
1351
+ "<|6.96|>": 50712,
1352
+ "<|6.98|>": 50713,
1353
+ "<|7.00|>": 50714,
1354
+ "<|7.02|>": 50715,
1355
+ "<|7.04|>": 50716,
1356
+ "<|7.06|>": 50717,
1357
+ "<|7.08|>": 50718,
1358
+ "<|7.10|>": 50719,
1359
+ "<|7.12|>": 50720,
1360
+ "<|7.14|>": 50721,
1361
+ "<|7.16|>": 50722,
1362
+ "<|7.18|>": 50723,
1363
+ "<|7.20|>": 50724,
1364
+ "<|7.22|>": 50725,
1365
+ "<|7.24|>": 50726,
1366
+ "<|7.26|>": 50727,
1367
+ "<|7.28|>": 50728,
1368
+ "<|7.30|>": 50729,
1369
+ "<|7.32|>": 50730,
1370
+ "<|7.34|>": 50731,
1371
+ "<|7.36|>": 50732,
1372
+ "<|7.38|>": 50733,
1373
+ "<|7.40|>": 50734,
1374
+ "<|7.42|>": 50735,
1375
+ "<|7.44|>": 50736,
1376
+ "<|7.46|>": 50737,
1377
+ "<|7.48|>": 50738,
1378
+ "<|7.50|>": 50739,
1379
+ "<|7.52|>": 50740,
1380
+ "<|7.54|>": 50741,
1381
+ "<|7.56|>": 50742,
1382
+ "<|7.58|>": 50743,
1383
+ "<|7.60|>": 50744,
1384
+ "<|7.62|>": 50745,
1385
+ "<|7.64|>": 50746,
1386
+ "<|7.66|>": 50747,
1387
+ "<|7.68|>": 50748,
1388
+ "<|7.70|>": 50749,
1389
+ "<|7.72|>": 50750,
1390
+ "<|7.74|>": 50751,
1391
+ "<|7.76|>": 50752,
1392
+ "<|7.78|>": 50753,
1393
+ "<|7.80|>": 50754,
1394
+ "<|7.82|>": 50755,
1395
+ "<|7.84|>": 50756,
1396
+ "<|7.86|>": 50757,
1397
+ "<|7.88|>": 50758,
1398
+ "<|7.90|>": 50759,
1399
+ "<|7.92|>": 50760,
1400
+ "<|7.94|>": 50761,
1401
+ "<|7.96|>": 50762,
1402
+ "<|7.98|>": 50763,
1403
+ "<|8.00|>": 50764,
1404
+ "<|8.02|>": 50765,
1405
+ "<|8.04|>": 50766,
1406
+ "<|8.06|>": 50767,
1407
+ "<|8.08|>": 50768,
1408
+ "<|8.10|>": 50769,
1409
+ "<|8.12|>": 50770,
1410
+ "<|8.14|>": 50771,
1411
+ "<|8.16|>": 50772,
1412
+ "<|8.18|>": 50773,
1413
+ "<|8.20|>": 50774,
1414
+ "<|8.22|>": 50775,
1415
+ "<|8.24|>": 50776,
1416
+ "<|8.26|>": 50777,
1417
+ "<|8.28|>": 50778,
1418
+ "<|8.30|>": 50779,
1419
+ "<|8.32|>": 50780,
1420
+ "<|8.34|>": 50781,
1421
+ "<|8.36|>": 50782,
1422
+ "<|8.38|>": 50783,
1423
+ "<|8.40|>": 50784,
1424
+ "<|8.42|>": 50785,
1425
+ "<|8.44|>": 50786,
1426
+ "<|8.46|>": 50787,
1427
+ "<|8.48|>": 50788,
1428
+ "<|8.50|>": 50789,
1429
+ "<|8.52|>": 50790,
1430
+ "<|8.54|>": 50791,
1431
+ "<|8.56|>": 50792,
1432
+ "<|8.58|>": 50793,
1433
+ "<|8.60|>": 50794,
1434
+ "<|8.62|>": 50795,
1435
+ "<|8.64|>": 50796,
1436
+ "<|8.66|>": 50797,
1437
+ "<|8.68|>": 50798,
1438
+ "<|8.70|>": 50799,
1439
+ "<|8.72|>": 50800,
1440
+ "<|8.74|>": 50801,
1441
+ "<|8.76|>": 50802,
1442
+ "<|8.78|>": 50803,
1443
+ "<|8.80|>": 50804,
1444
+ "<|8.82|>": 50805,
1445
+ "<|8.84|>": 50806,
1446
+ "<|8.86|>": 50807,
1447
+ "<|8.88|>": 50808,
1448
+ "<|8.90|>": 50809,
1449
+ "<|8.92|>": 50810,
1450
+ "<|8.94|>": 50811,
1451
+ "<|8.96|>": 50812,
1452
+ "<|8.98|>": 50813,
1453
+ "<|9.00|>": 50814,
1454
+ "<|9.02|>": 50815,
1455
+ "<|9.04|>": 50816,
1456
+ "<|9.06|>": 50817,
1457
+ "<|9.08|>": 50818,
1458
+ "<|9.10|>": 50819,
1459
+ "<|9.12|>": 50820,
1460
+ "<|9.14|>": 50821,
1461
+ "<|9.16|>": 50822,
1462
+ "<|9.18|>": 50823,
1463
+ "<|9.20|>": 50824,
1464
+ "<|9.22|>": 50825,
1465
+ "<|9.24|>": 50826,
1466
+ "<|9.26|>": 50827,
1467
+ "<|9.28|>": 50828,
1468
+ "<|9.30|>": 50829,
1469
+ "<|9.32|>": 50830,
1470
+ "<|9.34|>": 50831,
1471
+ "<|9.36|>": 50832,
1472
+ "<|9.38|>": 50833,
1473
+ "<|9.40|>": 50834,
1474
+ "<|9.42|>": 50835,
1475
+ "<|9.44|>": 50836,
1476
+ "<|9.46|>": 50837,
1477
+ "<|9.48|>": 50838,
1478
+ "<|9.50|>": 50839,
1479
+ "<|9.52|>": 50840,
1480
+ "<|9.54|>": 50841,
1481
+ "<|9.56|>": 50842,
1482
+ "<|9.58|>": 50843,
1483
+ "<|9.60|>": 50844,
1484
+ "<|9.62|>": 50845,
1485
+ "<|9.64|>": 50846,
1486
+ "<|9.66|>": 50847,
1487
+ "<|9.68|>": 50848,
1488
+ "<|9.70|>": 50849,
1489
+ "<|9.72|>": 50850,
1490
+ "<|9.74|>": 50851,
1491
+ "<|9.76|>": 50852,
1492
+ "<|9.78|>": 50853,
1493
+ "<|9.80|>": 50854,
1494
+ "<|9.82|>": 50855,
1495
+ "<|9.84|>": 50856,
1496
+ "<|9.86|>": 50857,
1497
+ "<|9.88|>": 50858,
1498
+ "<|9.90|>": 50859,
1499
+ "<|9.92|>": 50860,
1500
+ "<|9.94|>": 50861,
1501
+ "<|9.96|>": 50862,
1502
+ "<|9.98|>": 50863,
1503
+ "<|af|>": 50327,
1504
+ "<|am|>": 50334,
1505
+ "<|ar|>": 50272,
1506
+ "<|as|>": 50350,
1507
+ "<|az|>": 50304,
1508
+ "<|ba|>": 50355,
1509
+ "<|be|>": 50330,
1510
+ "<|bg|>": 50292,
1511
+ "<|bn|>": 50302,
1512
+ "<|bo|>": 50347,
1513
+ "<|br|>": 50309,
1514
+ "<|bs|>": 50315,
1515
+ "<|ca|>": 50270,
1516
+ "<|cs|>": 50283,
1517
+ "<|cy|>": 50297,
1518
+ "<|da|>": 50285,
1519
+ "<|de|>": 50261,
1520
+ "<|el|>": 50281,
1521
+ "<|en|>": 50259,
1522
+ "<|es|>": 50262,
1523
+ "<|et|>": 50307,
1524
+ "<|eu|>": 50310,
1525
+ "<|fa|>": 50300,
1526
+ "<|fi|>": 50277,
1527
+ "<|fo|>": 50338,
1528
+ "<|fr|>": 50265,
1529
+ "<|gl|>": 50319,
1530
+ "<|gu|>": 50333,
1531
+ "<|haw|>": 50352,
1532
+ "<|ha|>": 50354,
1533
+ "<|he|>": 50279,
1534
+ "<|hi|>": 50276,
1535
+ "<|hr|>": 50291,
1536
+ "<|ht|>": 50339,
1537
+ "<|hu|>": 50286,
1538
+ "<|hy|>": 50312,
1539
+ "<|id|>": 50275,
1540
+ "<|is|>": 50311,
1541
+ "<|it|>": 50274,
1542
+ "<|ja|>": 50266,
1543
+ "<|jw|>": 50356,
1544
+ "<|ka|>": 50329,
1545
+ "<|kk|>": 50316,
1546
+ "<|km|>": 50323,
1547
+ "<|kn|>": 50306,
1548
+ "<|ko|>": 50264,
1549
+ "<|la|>": 50294,
1550
+ "<|lb|>": 50345,
1551
+ "<|ln|>": 50353,
1552
+ "<|lo|>": 50336,
1553
+ "<|lt|>": 50293,
1554
+ "<|lv|>": 50301,
1555
+ "<|mg|>": 50349,
1556
+ "<|mi|>": 50295,
1557
+ "<|mk|>": 50308,
1558
+ "<|ml|>": 50296,
1559
+ "<|mn|>": 50314,
1560
+ "<|mr|>": 50320,
1561
+ "<|ms|>": 50282,
1562
+ "<|mt|>": 50343,
1563
+ "<|my|>": 50346,
1564
+ "<|ne|>": 50313,
1565
+ "<|nl|>": 50271,
1566
+ "<|nn|>": 50342,
1567
+ "<|nocaptions|>": 50362,
1568
+ "<|notimestamps|>": 50363,
1569
+ "<|no|>": 50288,
1570
+ "<|oc|>": 50328,
1571
+ "<|pa|>": 50321,
1572
+ "<|pl|>": 50269,
1573
+ "<|ps|>": 50340,
1574
+ "<|pt|>": 50267,
1575
+ "<|ro|>": 50284,
1576
+ "<|ru|>": 50263,
1577
+ "<|sa|>": 50344,
1578
+ "<|sd|>": 50332,
1579
+ "<|si|>": 50322,
1580
+ "<|sk|>": 50298,
1581
+ "<|sl|>": 50305,
1582
+ "<|sn|>": 50324,
1583
+ "<|so|>": 50326,
1584
+ "<|sq|>": 50317,
1585
+ "<|sr|>": 50303,
1586
+ "<|startoflm|>": 50360,
1587
+ "<|startofprev|>": 50361,
1588
+ "<|startoftranscript|>": 50258,
1589
+ "<|su|>": 50357,
1590
+ "<|sv|>": 50273,
1591
+ "<|sw|>": 50318,
1592
+ "<|ta|>": 50287,
1593
+ "<|te|>": 50299,
1594
+ "<|tg|>": 50331,
1595
+ "<|th|>": 50289,
1596
+ "<|tk|>": 50341,
1597
+ "<|tl|>": 50348,
1598
+ "<|transcribe|>": 50359,
1599
+ "<|translate|>": 50358,
1600
+ "<|tr|>": 50268,
1601
+ "<|tt|>": 50351,
1602
+ "<|uk|>": 50280,
1603
+ "<|ur|>": 50290,
1604
+ "<|uz|>": 50337,
1605
+ "<|vi|>": 50278,
1606
+ "<|yi|>": 50335,
1607
+ "<|yo|>": 50325,
1608
+ "<|zh|>": 50260
1609
+ }
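
The `<|0.00|>` … `<|30.00|>` entries above are Whisper's timestamp tokens; the `<|xx|>` entries are language tokens. A minimal sketch of how the timestamp tokens surface at inference time (repo id and audio file are assumptions):

```python
# Sketch: timestamp tokens become per-chunk timestamps when
# return_timestamps=True is requested; repo id and audio path are assumptions.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="sarpba/whisper-base-hu-V2")
out = asr("sample.wav", return_timestamps=True)

for chunk in out["chunks"]:
    print(chunk["timestamp"], chunk["text"])   # e.g. (0.0, 4.2) '...'
```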
all_results.json ADDED
@@ -0,0 +1,15 @@
{
  "epoch": 2.999774113395076,
  "eval_loss": 0.08800012618303299,
  "eval_runtime": 89.1047,
  "eval_samples": 4263,
  "eval_samples_per_second": 47.843,
  "eval_steps_per_second": 0.752,
  "eval_wer": 0.0960365529699288,
  "total_flos": 2.756290459511145e+20,
  "train_loss": 0.20740618139383307,
  "train_runtime": 51184.2268,
  "train_samples": 1416604,
  "train_samples_per_second": 83.03,
  "train_steps_per_second": 0.649
}
config.json ADDED
@@ -0,0 +1,48 @@
{
  "_name_or_path": "openai/whisper-base",
  "activation_dropout": 0.0,
  "activation_function": "gelu",
  "apply_spec_augment": false,
  "architectures": [
    "WhisperForConditionalGeneration"
  ],
  "attention_dropout": 0.0,
  "begin_suppress_tokens": null,
  "bos_token_id": 50257,
  "classifier_proj_size": 256,
  "d_model": 512,
  "decoder_attention_heads": 8,
  "decoder_ffn_dim": 2048,
  "decoder_layerdrop": 0.0,
  "decoder_layers": 6,
  "decoder_start_token_id": 50258,
  "dropout": 0.0,
  "encoder_attention_heads": 8,
  "encoder_ffn_dim": 2048,
  "encoder_layerdrop": 0.0,
  "encoder_layers": 6,
  "eos_token_id": 50257,
  "forced_decoder_ids": null,
  "init_std": 0.02,
  "is_encoder_decoder": true,
  "mask_feature_length": 10,
  "mask_feature_min_masks": 0,
  "mask_feature_prob": 0.0,
  "mask_time_length": 10,
  "mask_time_min_masks": 2,
  "mask_time_prob": 0.05,
  "max_length": null,
  "max_source_positions": 1500,
  "max_target_positions": 448,
  "median_filter_width": 7,
  "model_type": "whisper",
  "num_hidden_layers": 6,
  "num_mel_bins": 80,
  "pad_token_id": 50257,
  "scale_embedding": false,
  "torch_dtype": "float32",
  "transformers_version": "4.48.0.dev0",
  "use_cache": true,
  "use_weighted_layer_sum": false,
  "vocab_size": 51865
}
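
A minimal sketch of how this configuration is read back when loading the checkpoint locally; the path `"."` is a placeholder for a directory containing the files from this commit:

```python
# Sketch: reading config.json back with transformers; "." is a placeholder path.
from transformers import WhisperConfig

config = WhisperConfig.from_pretrained(".")
print(config.d_model, config.encoder_layers, config.decoder_layers)  # 512 6 6
```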
eval_results.json ADDED
@@ -0,0 +1,9 @@
{
  "epoch": 2.999774113395076,
  "eval_loss": 0.08800012618303299,
  "eval_runtime": 89.1047,
  "eval_samples": 4263,
  "eval_samples_per_second": 47.843,
  "eval_steps_per_second": 0.752,
  "eval_wer": 0.0960365529699288
}
generation_config.json ADDED
@@ -0,0 +1,248 @@
1
+ {
2
+ "alignment_heads": [
3
+ [
4
+ 3,
5
+ 1
6
+ ],
7
+ [
8
+ 4,
9
+ 2
10
+ ],
11
+ [
12
+ 4,
13
+ 3
14
+ ],
15
+ [
16
+ 4,
17
+ 7
18
+ ],
19
+ [
20
+ 5,
21
+ 1
22
+ ],
23
+ [
24
+ 5,
25
+ 2
26
+ ],
27
+ [
28
+ 5,
29
+ 4
30
+ ],
31
+ [
32
+ 5,
33
+ 6
34
+ ]
35
+ ],
36
+ "begin_suppress_tokens": [
37
+ 220,
38
+ 50257
39
+ ],
40
+ "bos_token_id": 50257,
41
+ "decoder_start_token_id": 50258,
42
+ "eos_token_id": 50257,
43
+ "is_multilingual": true,
44
+ "lang_to_id": {
45
+ "<|af|>": 50327,
46
+ "<|am|>": 50334,
47
+ "<|ar|>": 50272,
48
+ "<|as|>": 50350,
49
+ "<|az|>": 50304,
50
+ "<|ba|>": 50355,
51
+ "<|be|>": 50330,
52
+ "<|bg|>": 50292,
53
+ "<|bn|>": 50302,
54
+ "<|bo|>": 50347,
55
+ "<|br|>": 50309,
56
+ "<|bs|>": 50315,
57
+ "<|ca|>": 50270,
58
+ "<|cs|>": 50283,
59
+ "<|cy|>": 50297,
60
+ "<|da|>": 50285,
61
+ "<|de|>": 50261,
62
+ "<|el|>": 50281,
63
+ "<|en|>": 50259,
64
+ "<|es|>": 50262,
65
+ "<|et|>": 50307,
66
+ "<|eu|>": 50310,
67
+ "<|fa|>": 50300,
68
+ "<|fi|>": 50277,
69
+ "<|fo|>": 50338,
70
+ "<|fr|>": 50265,
71
+ "<|gl|>": 50319,
72
+ "<|gu|>": 50333,
73
+ "<|haw|>": 50352,
74
+ "<|ha|>": 50354,
75
+ "<|he|>": 50279,
76
+ "<|hi|>": 50276,
77
+ "<|hr|>": 50291,
78
+ "<|ht|>": 50339,
79
+ "<|hu|>": 50286,
80
+ "<|hy|>": 50312,
81
+ "<|id|>": 50275,
82
+ "<|is|>": 50311,
83
+ "<|it|>": 50274,
84
+ "<|ja|>": 50266,
85
+ "<|jw|>": 50356,
86
+ "<|ka|>": 50329,
87
+ "<|kk|>": 50316,
88
+ "<|km|>": 50323,
89
+ "<|kn|>": 50306,
90
+ "<|ko|>": 50264,
91
+ "<|la|>": 50294,
92
+ "<|lb|>": 50345,
93
+ "<|ln|>": 50353,
94
+ "<|lo|>": 50336,
95
+ "<|lt|>": 50293,
96
+ "<|lv|>": 50301,
97
+ "<|mg|>": 50349,
98
+ "<|mi|>": 50295,
99
+ "<|mk|>": 50308,
100
+ "<|ml|>": 50296,
101
+ "<|mn|>": 50314,
102
+ "<|mr|>": 50320,
103
+ "<|ms|>": 50282,
104
+ "<|mt|>": 50343,
105
+ "<|my|>": 50346,
106
+ "<|ne|>": 50313,
107
+ "<|nl|>": 50271,
108
+ "<|nn|>": 50342,
109
+ "<|no|>": 50288,
110
+ "<|oc|>": 50328,
111
+ "<|pa|>": 50321,
112
+ "<|pl|>": 50269,
113
+ "<|ps|>": 50340,
114
+ "<|pt|>": 50267,
115
+ "<|ro|>": 50284,
116
+ "<|ru|>": 50263,
117
+ "<|sa|>": 50344,
118
+ "<|sd|>": 50332,
119
+ "<|si|>": 50322,
120
+ "<|sk|>": 50298,
121
+ "<|sl|>": 50305,
122
+ "<|sn|>": 50324,
123
+ "<|so|>": 50326,
124
+ "<|sq|>": 50317,
125
+ "<|sr|>": 50303,
126
+ "<|su|>": 50357,
127
+ "<|sv|>": 50273,
128
+ "<|sw|>": 50318,
129
+ "<|ta|>": 50287,
130
+ "<|te|>": 50299,
131
+ "<|tg|>": 50331,
132
+ "<|th|>": 50289,
133
+ "<|tk|>": 50341,
134
+ "<|tl|>": 50348,
135
+ "<|tr|>": 50268,
136
+ "<|tt|>": 50351,
137
+ "<|uk|>": 50280,
138
+ "<|ur|>": 50290,
139
+ "<|uz|>": 50337,
140
+ "<|vi|>": 50278,
141
+ "<|yi|>": 50335,
142
+ "<|yo|>": 50325,
143
+ "<|zh|>": 50260
144
+ },
145
+ "language": "hungarian",
146
+ "max_initial_timestamp_index": 50,
147
+ "max_length": 448,
148
+ "no_timestamps_token_id": 50363,
149
+ "pad_token_id": 50257,
150
+ "prev_sot_token_id": 50361,
151
+ "return_timestamps": false,
152
+ "suppress_tokens": [
153
+ 1,
154
+ 2,
155
+ 7,
156
+ 8,
157
+ 9,
158
+ 10,
159
+ 14,
160
+ 25,
161
+ 26,
162
+ 27,
163
+ 28,
164
+ 29,
165
+ 31,
166
+ 58,
167
+ 59,
168
+ 60,
169
+ 61,
170
+ 62,
171
+ 63,
172
+ 90,
173
+ 91,
174
+ 92,
175
+ 93,
176
+ 359,
177
+ 503,
178
+ 522,
179
+ 542,
180
+ 873,
181
+ 893,
182
+ 902,
183
+ 918,
184
+ 922,
185
+ 931,
186
+ 1350,
187
+ 1853,
188
+ 1982,
189
+ 2460,
190
+ 2627,
191
+ 3246,
192
+ 3253,
193
+ 3268,
194
+ 3536,
195
+ 3846,
196
+ 3961,
197
+ 4183,
198
+ 4667,
199
+ 6585,
200
+ 6647,
201
+ 7273,
202
+ 9061,
203
+ 9383,
204
+ 10428,
205
+ 10929,
206
+ 11938,
207
+ 12033,
208
+ 12331,
209
+ 12562,
210
+ 13793,
211
+ 14157,
212
+ 14635,
213
+ 15265,
214
+ 15618,
215
+ 16553,
216
+ 16604,
217
+ 18362,
218
+ 18956,
219
+ 20075,
220
+ 21675,
221
+ 22520,
222
+ 26130,
223
+ 26161,
224
+ 26435,
225
+ 28279,
226
+ 29464,
227
+ 31650,
228
+ 32302,
229
+ 32470,
230
+ 36865,
231
+ 42863,
232
+ 47425,
233
+ 49870,
234
+ 50254,
235
+ 50258,
236
+ 50358,
237
+ 50359,
238
+ 50360,
239
+ 50361,
240
+ 50362
241
+ ],
242
+ "task": "transcribe",
243
+ "task_to_id": {
244
+ "transcribe": 50359,
245
+ "translate": 50358
246
+ },
247
+ "transformers_version": "4.48.0.dev0"
248
+ }
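
The `language`/`task` defaults above (`"hungarian"`, `"transcribe"`) correspond to the decoder prompt tokens `<|hu|>` (50286), `<|transcribe|>` (50359) and `<|notimestamps|>` (50363). A minimal sketch, using the base processor as a stand-in, of how these prompt ids are derived:

```python
# Sketch: deriving the decoder prompt ids implied by generation_config.json.
from transformers import WhisperProcessor

processor = WhisperProcessor.from_pretrained("openai/whisper-base")
prompt_ids = processor.get_decoder_prompt_ids(language="hungarian", task="transcribe")
print(prompt_ids)  # [(1, 50286), (2, 50359), (3, 50363)]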
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b45d8d61ba92b567a0f1a5b248db6a0e29625cf5d34171bcc750900ad9f94491
size 290403936
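
The three lines above are a Git LFS pointer, not the weights themselves; the roughly 290 MB safetensors file lives in LFS storage. A sketch of fetching it with `huggingface_hub` (the repo id is an assumption):

```python
# Sketch: resolving the LFS pointer to the actual weight file.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="sarpba/whisper-base-hu-V2",   # assumed repo id
    filename="model.safetensors",
)
print(path)
```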
normalizer.json ADDED
@@ -0,0 +1,1742 @@
1
+ {
2
+ "accessorise": "accessorize",
3
+ "accessorised": "accessorized",
4
+ "accessorises": "accessorizes",
5
+ "accessorising": "accessorizing",
6
+ "acclimatisation": "acclimatization",
7
+ "acclimatise": "acclimatize",
8
+ "acclimatised": "acclimatized",
9
+ "acclimatises": "acclimatizes",
10
+ "acclimatising": "acclimatizing",
11
+ "accoutrements": "accouterments",
12
+ "aeon": "eon",
13
+ "aeons": "eons",
14
+ "aerogramme": "aerogram",
15
+ "aerogrammes": "aerograms",
16
+ "aeroplane": "airplane",
17
+ "aeroplanes": "airplanes",
18
+ "aesthete": "esthete",
19
+ "aesthetes": "esthetes",
20
+ "aesthetic": "esthetic",
21
+ "aesthetically": "esthetically",
22
+ "aesthetics": "esthetics",
23
+ "aetiology": "etiology",
24
+ "ageing": "aging",
25
+ "aggrandisement": "aggrandizement",
26
+ "agonise": "agonize",
27
+ "agonised": "agonized",
28
+ "agonises": "agonizes",
29
+ "agonising": "agonizing",
30
+ "agonisingly": "agonizingly",
31
+ "almanack": "almanac",
32
+ "almanacks": "almanacs",
33
+ "aluminium": "aluminum",
34
+ "amortisable": "amortizable",
35
+ "amortisation": "amortization",
36
+ "amortisations": "amortizations",
37
+ "amortise": "amortize",
38
+ "amortised": "amortized",
39
+ "amortises": "amortizes",
40
+ "amortising": "amortizing",
41
+ "amphitheatre": "amphitheater",
42
+ "amphitheatres": "amphitheaters",
43
+ "anaemia": "anemia",
44
+ "anaemic": "anemic",
45
+ "anaesthesia": "anesthesia",
46
+ "anaesthetic": "anesthetic",
47
+ "anaesthetics": "anesthetics",
48
+ "anaesthetise": "anesthetize",
49
+ "anaesthetised": "anesthetized",
50
+ "anaesthetises": "anesthetizes",
51
+ "anaesthetising": "anesthetizing",
52
+ "anaesthetist": "anesthetist",
53
+ "anaesthetists": "anesthetists",
54
+ "anaesthetize": "anesthetize",
55
+ "anaesthetized": "anesthetized",
56
+ "anaesthetizes": "anesthetizes",
57
+ "anaesthetizing": "anesthetizing",
58
+ "analogue": "analog",
59
+ "analogues": "analogs",
60
+ "analyse": "analyze",
61
+ "analysed": "analyzed",
62
+ "analyses": "analyzes",
63
+ "analysing": "analyzing",
64
+ "anglicise": "anglicize",
65
+ "anglicised": "anglicized",
66
+ "anglicises": "anglicizes",
67
+ "anglicising": "anglicizing",
68
+ "annualised": "annualized",
69
+ "antagonise": "antagonize",
70
+ "antagonised": "antagonized",
71
+ "antagonises": "antagonizes",
72
+ "antagonising": "antagonizing",
73
+ "apologise": "apologize",
74
+ "apologised": "apologized",
75
+ "apologises": "apologizes",
76
+ "apologising": "apologizing",
77
+ "appal": "appall",
78
+ "appals": "appalls",
79
+ "appetiser": "appetizer",
80
+ "appetisers": "appetizers",
81
+ "appetising": "appetizing",
82
+ "appetisingly": "appetizingly",
83
+ "arbour": "arbor",
84
+ "arbours": "arbors",
85
+ "archaeologically": "archeologically",
86
+ "archaeologist": "archeologist",
87
+ "archaeologists": "archeologists",
88
+ "archaeology": "archeology</span>",
89
+ "archeological": "archaeological",
90
+ "ardour": "ardor",
91
+ "armour": "armor",
92
+ "armoured": "armored",
93
+ "armourer": "armorer",
94
+ "armourers": "armorers",
95
+ "armouries": "armories",
96
+ "armoury": "armory",
97
+ "artefact": "artifact",
98
+ "artefacts": "artifacts",
99
+ "authorise": "authorize",
100
+ "authorised": "authorized",
101
+ "authorises": "authorizes",
102
+ "authorising": "authorizing",
103
+ "axe": "ax",
104
+ "backpedalled": "backpedaled",
105
+ "backpedalling": "backpedaling",
106
+ "bannister": "banister",
107
+ "bannisters": "banisters",
108
+ "baptise": "baptize",
109
+ "baptised": "baptized",
110
+ "baptises": "baptizes",
111
+ "baptising": "baptizing",
112
+ "bastardise": "bastardize",
113
+ "bastardised": "bastardized",
114
+ "bastardises": "bastardizes",
115
+ "bastardising": "bastardizing",
116
+ "battleax": "battleaxe",
117
+ "baulk": "balk",
118
+ "baulked": "balked",
119
+ "baulking": "balking",
120
+ "baulks": "balks",
121
+ "bedevilled": "bedeviled",
122
+ "bedevilling": "bedeviling",
123
+ "behaviour": "behavior",
124
+ "behavioural": "behavioral",
125
+ "behaviourism": "behaviorism",
126
+ "behaviourist": "behaviorist",
127
+ "behaviourists": "behaviorists",
128
+ "behaviours": "behaviors",
129
+ "behove": "behoove",
130
+ "behoved": "behooved",
131
+ "behoves": "behooves",
132
+ "bejewelled": "bejeweled",
133
+ "belabour": "belabor",
134
+ "belaboured": "belabored",
135
+ "belabouring": "belaboring",
136
+ "belabours": "belabors",
137
+ "bevelled": "beveled",
138
+ "bevvies": "bevies",
139
+ "bevvy": "bevy",
140
+ "biassed": "biased",
141
+ "biassing": "biasing",
142
+ "bingeing": "binging",
143
+ "bougainvillaea": "bougainvillea",
144
+ "bougainvillaeas": "bougainvilleas",
145
+ "bowdlerise": "bowdlerize",
146
+ "bowdlerised": "bowdlerized",
147
+ "bowdlerises": "bowdlerizes",
148
+ "bowdlerising": "bowdlerizing",
149
+ "breathalyse": "breathalyze",
150
+ "breathalysed": "breathalyzed",
151
+ "breathalyser": "breathalyzer",
152
+ "breathalysers": "breathalyzers",
153
+ "breathalyses": "breathalyzes",
154
+ "breathalysing": "breathalyzing",
155
+ "brutalise": "brutalize",
156
+ "brutalised": "brutalized",
157
+ "brutalises": "brutalizes",
158
+ "brutalising": "brutalizing",
159
+ "busses": "buses",
160
+ "bussing": "busing",
161
+ "caesarean": "cesarean",
162
+ "caesareans": "cesareans",
163
+ "calibre": "caliber",
164
+ "calibres": "calibers",
165
+ "calliper": "caliper",
166
+ "callipers": "calipers",
167
+ "callisthenics": "calisthenics",
168
+ "canalise": "canalize",
169
+ "canalised": "canalized",
170
+ "canalises": "canalizes",
171
+ "canalising": "canalizing",
172
+ "cancelation": "cancellation",
173
+ "cancelations": "cancellations",
174
+ "cancelled": "canceled",
175
+ "cancelling": "canceling",
176
+ "candour": "candor",
177
+ "cannibalise": "cannibalize",
178
+ "cannibalised": "cannibalized",
179
+ "cannibalises": "cannibalizes",
180
+ "cannibalising": "cannibalizing",
181
+ "canonise": "canonize",
182
+ "canonised": "canonized",
183
+ "canonises": "canonizes",
184
+ "canonising": "canonizing",
185
+ "capitalise": "capitalize",
186
+ "capitalised": "capitalized",
187
+ "capitalises": "capitalizes",
188
+ "capitalising": "capitalizing",
189
+ "caramelise": "caramelize",
190
+ "caramelised": "caramelized",
191
+ "caramelises": "caramelizes",
192
+ "caramelising": "caramelizing",
193
+ "carbonise": "carbonize",
194
+ "carbonised": "carbonized",
195
+ "carbonises": "carbonizes",
196
+ "carbonising": "carbonizing",
197
+ "carolled": "caroled",
198
+ "carolling": "caroling",
199
+ "catalogue": "catalog",
200
+ "catalogued": "cataloged",
201
+ "catalogues": "catalogs",
202
+ "cataloguing": "cataloging",
203
+ "catalyse": "catalyze",
204
+ "catalysed": "catalyzed",
205
+ "catalyses": "catalyzes",
206
+ "catalysing": "catalyzing",
207
+ "categorise": "categorize",
208
+ "categorised": "categorized",
209
+ "categorises": "categorizes",
210
+ "categorising": "categorizing",
211
+ "cauterise": "cauterize",
212
+ "cauterised": "cauterized",
213
+ "cauterises": "cauterizes",
214
+ "cauterising": "cauterizing",
215
+ "cavilled": "caviled",
216
+ "cavilling": "caviling",
217
+ "centigramme": "centigram",
218
+ "centigrammes": "centigrams",
219
+ "centilitre": "centiliter",
220
+ "centilitres": "centiliters",
221
+ "centimetre": "centimeter",
222
+ "centimetres": "centimeters",
223
+ "centralise": "centralize",
224
+ "centralised": "centralized",
225
+ "centralises": "centralizes",
226
+ "centralising": "centralizing",
227
+ "centre": "center",
228
+ "centred": "centered",
229
+ "centrefold": "centerfold",
230
+ "centrefolds": "centerfolds",
231
+ "centrepiece": "centerpiece",
232
+ "centrepieces": "centerpieces",
233
+ "centres": "centers",
234
+ "channelled": "channeled",
235
+ "channelling": "channeling",
236
+ "characterise": "characterize",
237
+ "characterised": "characterized",
238
+ "characterises": "characterizes",
239
+ "characterising": "characterizing",
240
+ "cheque": "check",
241
+ "chequebook": "checkbook",
242
+ "chequebooks": "checkbooks",
243
+ "chequered": "checkered",
244
+ "cheques": "checks",
245
+ "chilli": "chili",
246
+ "chimaera": "chimera",
247
+ "chimaeras": "chimeras",
248
+ "chiselled": "chiseled",
249
+ "chiselling": "chiseling",
250
+ "circularise": "circularize",
251
+ "circularised": "circularized",
252
+ "circularises": "circularizes",
253
+ "circularising": "circularizing",
254
+ "civilise": "civilize",
255
+ "civilised": "civilized",
256
+ "civilises": "civilizes",
257
+ "civilising": "civilizing",
258
+ "clamour": "clamor",
259
+ "clamoured": "clamored",
260
+ "clamouring": "clamoring",
261
+ "clamours": "clamors",
262
+ "clangour": "clangor",
263
+ "clarinettist": "clarinetist",
264
+ "clarinettists": "clarinetists",
265
+ "collectivise": "collectivize",
266
+ "collectivised": "collectivized",
267
+ "collectivises": "collectivizes",
268
+ "collectivising": "collectivizing",
269
+ "colonisation": "colonization",
270
+ "colonise": "colonize",
271
+ "colonised": "colonized",
272
+ "coloniser": "colonizer",
273
+ "colonisers": "colonizers",
274
+ "colonises": "colonizes",
275
+ "colonising": "colonizing",
276
+ "colour": "color",
277
+ "colourant": "colorant",
278
+ "colourants": "colorants",
279
+ "coloured": "colored",
280
+ "coloureds": "coloreds",
281
+ "colourful": "colorful",
282
+ "colourfully": "colorfully",
283
+ "colouring": "coloring",
284
+ "colourize": "colorize",
285
+ "colourized": "colorized",
286
+ "colourizes": "colorizes",
287
+ "colourizing": "colorizing",
288
+ "colourless": "colorless",
289
+ "colours": "colors",
290
+ "commercialise": "commercialize",
291
+ "commercialised": "commercialized",
292
+ "commercialises": "commercializes",
293
+ "commercialising": "commercializing",
294
+ "compartmentalise": "compartmentalize",
295
+ "compartmentalised": "compartmentalized",
296
+ "compartmentalises": "compartmentalizes",
297
+ "compartmentalising": "compartmentalizing",
298
+ "computerise": "computerize",
299
+ "computerised": "computerized",
300
+ "computerises": "computerizes",
301
+ "computerising": "computerizing",
302
+ "conceptualise": "conceptualize",
303
+ "conceptualised": "conceptualized",
304
+ "conceptualises": "conceptualizes",
305
+ "conceptualising": "conceptualizing",
306
+ "connexion": "connection",
307
+ "connexions": "connections",
308
+ "contextualise": "contextualize",
309
+ "contextualised": "contextualized",
310
+ "contextualises": "contextualizes",
311
+ "contextualising": "contextualizing",
312
+ "cosier": "cozier",
313
+ "cosies": "cozies",
314
+ "cosiest": "coziest",
315
+ "cosily": "cozily",
316
+ "cosiness": "coziness",
317
+ "cosy": "cozy",
318
+ "councillor": "councilor",
319
+ "councillors": "councilors",
320
+ "counselled": "counseled",
321
+ "counselling": "counseling",
322
+ "counsellor": "counselor",
323
+ "counsellors": "counselors",
324
+ "crenelated": "crenellated",
325
+ "criminalise": "criminalize",
326
+ "criminalised": "criminalized",
327
+ "criminalises": "criminalizes",
328
+ "criminalising": "criminalizing",
329
+ "criticise": "criticize",
330
+ "criticised": "criticized",
331
+ "criticises": "criticizes",
332
+ "criticising": "criticizing",
333
+ "crueller": "crueler",
334
+ "cruellest": "cruelest",
335
+ "crystallisation": "crystallization",
336
+ "crystallise": "crystallize",
337
+ "crystallised": "crystallized",
338
+ "crystallises": "crystallizes",
339
+ "crystallising": "crystallizing",
340
+ "cudgelled": "cudgeled",
341
+ "cudgelling": "cudgeling",
342
+ "customise": "customize",
343
+ "customised": "customized",
344
+ "customises": "customizes",
345
+ "customising": "customizing",
346
+ "cypher": "cipher",
347
+ "cyphers": "ciphers",
348
+ "decentralisation": "decentralization",
349
+ "decentralise": "decentralize",
350
+ "decentralised": "decentralized",
351
+ "decentralises": "decentralizes",
352
+ "decentralising": "decentralizing",
353
+ "decriminalisation": "decriminalization",
354
+ "decriminalise": "decriminalize",
355
+ "decriminalised": "decriminalized",
356
+ "decriminalises": "decriminalizes",
357
+ "decriminalising": "decriminalizing",
358
+ "defence": "defense",
359
+ "defenceless": "defenseless",
360
+ "defences": "defenses",
361
+ "dehumanisation": "dehumanization",
362
+ "dehumanise": "dehumanize",
363
+ "dehumanised": "dehumanized",
364
+ "dehumanises": "dehumanizes",
365
+ "dehumanising": "dehumanizing",
366
+ "demeanour": "demeanor",
367
+ "demilitarisation": "demilitarization",
368
+ "demilitarise": "demilitarize",
369
+ "demilitarised": "demilitarized",
370
+ "demilitarises": "demilitarizes",
371
+ "demilitarising": "demilitarizing",
372
+ "demobilisation": "demobilization",
373
+ "demobilise": "demobilize",
374
+ "demobilised": "demobilized",
375
+ "demobilises": "demobilizes",
376
+ "demobilising": "demobilizing",
377
+ "democratisation": "democratization",
378
+ "democratise": "democratize",
379
+ "democratised": "democratized",
380
+ "democratises": "democratizes",
381
+ "democratising": "democratizing",
382
+ "demonise": "demonize",
383
+ "demonised": "demonized",
384
+ "demonises": "demonizes",
385
+ "demonising": "demonizing",
386
+ "demoralisation": "demoralization",
387
+ "demoralise": "demoralize",
388
+ "demoralised": "demoralized",
389
+ "demoralises": "demoralizes",
390
+ "demoralising": "demoralizing",
391
+ "denationalisation": "denationalization",
392
+ "denationalise": "denationalize",
393
+ "denationalised": "denationalized",
394
+ "denationalises": "denationalizes",
395
+ "denationalising": "denationalizing",
396
+ "deodorise": "deodorize",
397
+ "deodorised": "deodorized",
398
+ "deodorises": "deodorizes",
399
+ "deodorising": "deodorizing",
400
+ "depersonalise": "depersonalize",
401
+ "depersonalised": "depersonalized",
402
+ "depersonalises": "depersonalizes",
403
+ "depersonalising": "depersonalizing",
404
+ "deputise": "deputize",
405
+ "deputised": "deputized",
406
+ "deputises": "deputizes",
407
+ "deputising": "deputizing",
408
+ "desensitisation": "desensitization",
409
+ "desensitise": "desensitize",
410
+ "desensitised": "desensitized",
411
+ "desensitises": "desensitizes",
412
+ "desensitising": "desensitizing",
413
+ "destabilisation": "destabilization",
414
+ "destabilise": "destabilize",
415
+ "destabilised": "destabilized",
416
+ "destabilises": "destabilizes",
417
+ "destabilising": "destabilizing",
418
+ "dialled": "dialed",
419
+ "dialling": "dialing",
420
+ "dialogue": "dialog",
421
+ "dialogues": "dialogs",
422
+ "diarrhoea": "diarrhea",
423
+ "digitise": "digitize",
424
+ "digitised": "digitized",
425
+ "digitises": "digitizes",
426
+ "digitising": "digitizing",
427
+ "disc": "disk",
428
+ "discolour": "discolor",
429
+ "discoloured": "discolored",
430
+ "discolouring": "discoloring",
431
+ "discolours": "discolors",
432
+ "discs": "disks",
433
+ "disembowelled": "disemboweled",
434
+ "disembowelling": "disemboweling",
435
+ "disfavour": "disfavor",
436
+ "dishevelled": "disheveled",
437
+ "dishonour": "dishonor",
438
+ "dishonourable": "dishonorable",
439
+ "dishonourably": "dishonorably",
440
+ "dishonoured": "dishonored",
441
+ "dishonouring": "dishonoring",
442
+ "dishonours": "dishonors",
443
+ "disorganisation": "disorganization",
444
+ "disorganised": "disorganized",
445
+ "distil": "distill",
446
+ "distils": "distills",
447
+ "dramatisation": "dramatization",
448
+ "dramatisations": "dramatizations",
449
+ "dramatise": "dramatize",
450
+ "dramatised": "dramatized",
451
+ "dramatises": "dramatizes",
452
+ "dramatising": "dramatizing",
453
+ "draught": "draft",
454
+ "draughtboard": "draftboard",
455
+ "draughtboards": "draftboards",
456
+ "draughtier": "draftier",
457
+ "draughtiest": "draftiest",
458
+ "draughts": "drafts",
459
+ "draughtsman": "draftsman",
460
+ "draughtsmanship": "draftsmanship",
461
+ "draughtsmen": "draftsmen",
462
+ "draughtswoman": "draftswoman",
463
+ "draughtswomen": "draftswomen",
464
+ "draughty": "drafty",
465
+ "drivelled": "driveled",
466
+ "drivelling": "driveling",
467
+ "duelled": "dueled",
468
+ "duelling": "dueling",
469
+ "economise": "economize",
470
+ "economised": "economized",
471
+ "economises": "economizes",
472
+ "economising": "economizing",
473
+ "editorialise": "editorialize",
474
+ "editorialised": "editorialized",
475
+ "editorialises": "editorializes",
476
+ "editorialising": "editorializing",
477
+ "edoema": "edema",
478
+ "empathise": "empathize",
479
+ "empathised": "empathized",
480
+ "empathises": "empathizes",
481
+ "empathising": "empathizing",
482
+ "emphasise": "emphasize",
483
+ "emphasised": "emphasized",
484
+ "emphasises": "emphasizes",
485
+ "emphasising": "emphasizing",
486
+ "enamelled": "enameled",
487
+ "enamelling": "enameling",
488
+ "enamoured": "enamored",
489
+ "encyclopaedia": "encyclopedia",
490
+ "encyclopaedias": "encyclopedias",
491
+ "encyclopaedic": "encyclopedic",
492
+ "endeavour": "endeavor",
493
+ "endeavoured": "endeavored",
494
+ "endeavouring": "endeavoring",
495
+ "endeavours": "endeavors",
496
+ "energise": "energize",
497
+ "energised": "energized",
498
+ "energises": "energizes",
499
+ "energising": "energizing",
500
+ "enrol": "enroll",
501
+ "enrols": "enrolls",
502
+ "enthral": "enthrall",
503
+ "enthrals": "enthralls",
504
+ "epaulette": "epaulet",
505
+ "epaulettes": "epaulets",
506
+ "epicentre": "epicenter",
507
+ "epicentres": "epicenters",
508
+ "epilogue": "epilog",
509
+ "epilogues": "epilogs",
510
+ "epitomise": "epitomize",
511
+ "epitomised": "epitomized",
512
+ "epitomises": "epitomizes",
513
+ "epitomising": "epitomizing",
514
+ "equalisation": "equalization",
515
+ "equalise": "equalize",
516
+ "equalised": "equalized",
517
+ "equaliser": "equalizer",
518
+ "equalisers": "equalizers",
519
+ "equalises": "equalizes",
520
+ "equalising": "equalizing",
521
+ "eulogise": "eulogize",
522
+ "eulogised": "eulogized",
523
+ "eulogises": "eulogizes",
524
+ "eulogising": "eulogizing",
525
+ "evangelise": "evangelize",
526
+ "evangelised": "evangelized",
527
+ "evangelises": "evangelizes",
528
+ "evangelising": "evangelizing",
529
+ "exorcise": "exorcize",
530
+ "exorcised": "exorcized",
531
+ "exorcises": "exorcizes",
532
+ "exorcising": "exorcizing",
533
+ "extemporisation": "extemporization",
534
+ "extemporise": "extemporize",
535
+ "extemporised": "extemporized",
536
+ "extemporises": "extemporizes",
537
+ "extemporising": "extemporizing",
538
+ "externalisation": "externalization",
539
+ "externalisations": "externalizations",
540
+ "externalise": "externalize",
541
+ "externalised": "externalized",
542
+ "externalises": "externalizes",
543
+ "externalising": "externalizing",
544
+ "factorise": "factorize",
545
+ "factorised": "factorized",
546
+ "factorises": "factorizes",
547
+ "factorising": "factorizing",
548
+ "faecal": "fecal",
549
+ "faeces": "feces",
550
+ "familiarisation": "familiarization",
551
+ "familiarise": "familiarize",
552
+ "familiarised": "familiarized",
553
+ "familiarises": "familiarizes",
554
+ "familiarising": "familiarizing",
555
+ "fantasise": "fantasize",
556
+ "fantasised": "fantasized",
557
+ "fantasises": "fantasizes",
558
+ "fantasising": "fantasizing",
559
+ "favour": "favor",
560
+ "favourable": "favorable",
561
+ "favourably": "favorably",
562
+ "favoured": "favored",
563
+ "favouring": "favoring",
564
+ "favourite": "favorite",
565
+ "favourites": "favorites",
566
+ "favouritism": "favoritism",
567
+ "favours": "favors",
568
+ "feminise": "feminize",
569
+ "feminised": "feminized",
570
+ "feminises": "feminizes",
571
+ "feminising": "feminizing",
572
+ "fertilisation": "fertilization",
573
+ "fertilise": "fertilize",
574
+ "fertilised": "fertilized",
575
+ "fertiliser": "fertilizer",
576
+ "fertilisers": "fertilizers",
577
+ "fertilises": "fertilizes",
578
+ "fertilising": "fertilizing",
579
+ "fervour": "fervor",
580
+ "fibre": "fiber",
581
+ "fibreglass": "fiberglass",
582
+ "fibres": "fibers",
583
+ "fictionalisation": "fictionalization",
584
+ "fictionalisations": "fictionalizations",
585
+ "fictionalise": "fictionalize",
586
+ "fictionalised": "fictionalized",
587
+ "fictionalises": "fictionalizes",
588
+ "fictionalising": "fictionalizing",
589
+ "fillet": "filet",
590
+ "filleted": "fileted",
591
+ "filleting": "fileting",
592
+ "fillets": "filets",
593
+ "finalisation": "finalization",
594
+ "finalise": "finalize",
595
+ "finalised": "finalized",
596
+ "finalises": "finalizes",
597
+ "finalising": "finalizing",
598
+ "flautist": "flutist",
599
+ "flautists": "flutists",
600
+ "flavour": "flavor",
601
+ "flavoured": "flavored",
602
+ "flavouring": "flavoring",
603
+ "flavourings": "flavorings",
604
+ "flavourless": "flavorless",
605
+ "flavours": "flavors",
606
+ "flavoursome": "flavorsome",
607
+ "flyer / flier": "flier / flyer",
608
+ "foetal": "fetal",
609
+ "foetid": "fetid",
610
+ "foetus": "fetus",
611
+ "foetuses": "fetuses",
612
+ "formalisation": "formalization",
613
+ "formalise": "formalize",
614
+ "formalised": "formalized",
615
+ "formalises": "formalizes",
616
+ "formalising": "formalizing",
617
+ "fossilisation": "fossilization",
618
+ "fossilise": "fossilize",
619
+ "fossilised": "fossilized",
620
+ "fossilises": "fossilizes",
621
+ "fossilising": "fossilizing",
622
+ "fraternisation": "fraternization",
623
+ "fraternise": "fraternize",
624
+ "fraternised": "fraternized",
625
+ "fraternises": "fraternizes",
626
+ "fraternising": "fraternizing",
627
+ "fulfil": "fulfill",
628
+ "fulfilment": "fulfillment",
629
+ "fulfils": "fulfills",
630
+ "funnelled": "funneled",
631
+ "funnelling": "funneling",
632
+ "gage": "gauge",
633
+ "gaged": "gauged",
634
+ "gages": "gauges",
635
+ "gaging": "gauging",
636
+ "galvanise": "galvanize",
637
+ "galvanised": "galvanized",
638
+ "galvanises": "galvanizes",
639
+ "galvanising": "galvanizing",
640
+ "gambolled": "gamboled",
641
+ "gambolling": "gamboling",
642
+ "gaol": "jail",
643
+ "gaolbird": "jailbird",
644
+ "gaolbirds": "jailbirds",
645
+ "gaolbreak": "jailbreak",
646
+ "gaolbreaks": "jailbreaks",
647
+ "gaoled": "jailed",
648
+ "gaoler": "jailer",
649
+ "gaolers": "jailers",
650
+ "gaoling": "jailing",
651
+ "gaols": "jails",
652
+ "gasses": "gases",
653
+ "generalisation": "generalization",
654
+ "generalisations": "generalizations",
655
+ "generalise": "generalize",
656
+ "generalised": "generalized",
657
+ "generalises": "generalizes",
658
+ "generalising": "generalizing",
659
+ "ghettoise": "ghettoize",
660
+ "ghettoised": "ghettoized",
661
+ "ghettoises": "ghettoizes",
662
+ "ghettoising": "ghettoizing",
663
+ "gipsies": "gypsies",
664
+ "glamor": "glamour",
665
+ "glamorise": "glamorize",
666
+ "glamorised": "glamorized",
667
+ "glamorises": "glamorizes",
668
+ "glamorising": "glamorizing",
669
+ "globalisation": "globalization",
670
+ "globalise": "globalize",
671
+ "globalised": "globalized",
672
+ "globalises": "globalizes",
673
+ "globalising": "globalizing",
674
+ "glueing": "gluing",
675
+ "goitre": "goiter",
676
+ "goitres": "goiters",
677
+ "gonorrhoea": "gonorrhea",
678
+ "gramme": "gram",
679
+ "grammes": "grams",
680
+ "gravelled": "graveled",
681
+ "grey": "gray",
682
+ "greyed": "grayed",
683
+ "greying": "graying",
684
+ "greyish": "grayish",
685
+ "greyness": "grayness",
686
+ "greys": "grays",
687
+ "grovelled": "groveled",
688
+ "grovelling": "groveling",
689
+ "groyne": "groin",
690
+ "groynes": "groins",
691
+ "gruelling": "grueling",
692
+ "gruellingly": "gruelingly",
693
+ "gryphon": "griffin",
694
+ "gryphons": "griffins",
695
+ "gynaecological": "gynecological",
696
+ "gynaecologist": "gynecologist",
697
+ "gynaecologists": "gynecologists",
698
+ "gynaecology": "gynecology",
699
+ "haematological": "hematological",
700
+ "haematologist": "hematologist",
701
+ "haematologists": "hematologists",
702
+ "haematology": "hematology",
703
+ "haemoglobin": "hemoglobin",
704
+ "haemophilia": "hemophilia",
705
+ "haemophiliac": "hemophiliac",
706
+ "haemophiliacs": "hemophiliacs",
707
+ "haemorrhage": "hemorrhage",
708
+ "haemorrhaged": "hemorrhaged",
709
+ "haemorrhages": "hemorrhages",
710
+ "haemorrhaging": "hemorrhaging",
711
+ "haemorrhoids": "hemorrhoids",
712
+ "harbour": "harbor",
713
+ "harboured": "harbored",
714
+ "harbouring": "harboring",
715
+ "harbours": "harbors",
716
+ "harmonisation": "harmonization",
717
+ "harmonise": "harmonize",
718
+ "harmonised": "harmonized",
719
+ "harmonises": "harmonizes",
720
+ "harmonising": "harmonizing",
721
+ "homoeopath": "homeopath",
722
+ "homoeopathic": "homeopathic",
723
+ "homoeopaths": "homeopaths",
724
+ "homoeopathy": "homeopathy",
725
+ "homogenise": "homogenize",
726
+ "homogenised": "homogenized",
727
+ "homogenises": "homogenizes",
728
+ "homogenising": "homogenizing",
729
+ "honour": "honor",
730
+ "honourable": "honorable",
731
+ "honourably": "honorably",
732
+ "honoured": "honored",
733
+ "honouring": "honoring",
734
+ "honours": "honors",
735
+ "hospitalisation": "hospitalization",
736
+ "hospitalise": "hospitalize",
737
+ "hospitalised": "hospitalized",
738
+ "hospitalises": "hospitalizes",
739
+ "hospitalising": "hospitalizing",
740
+ "humanise": "humanize",
741
+ "humanised": "humanized",
742
+ "humanises": "humanizes",
743
+ "humanising": "humanizing",
744
+ "humour": "humor",
745
+ "humoured": "humored",
746
+ "humouring": "humoring",
747
+ "humourless": "humorless",
748
+ "humours": "humors",
749
+ "hybridise": "hybridize",
750
+ "hybridised": "hybridized",
751
+ "hybridises": "hybridizes",
752
+ "hybridising": "hybridizing",
753
+ "hypnotise": "hypnotize",
754
+ "hypnotised": "hypnotized",
755
+ "hypnotises": "hypnotizes",
756
+ "hypnotising": "hypnotizing",
757
+ "hypothesise": "hypothesize",
758
+ "hypothesised": "hypothesized",
759
+ "hypothesises": "hypothesizes",
760
+ "hypothesising": "hypothesizing",
761
+ "idealisation": "idealization",
762
+ "idealise": "idealize",
763
+ "idealised": "idealized",
764
+ "idealises": "idealizes",
765
+ "idealising": "idealizing",
766
+ "idolise": "idolize",
767
+ "idolised": "idolized",
768
+ "idolises": "idolizes",
769
+ "idolising": "idolizing",
770
+ "immobilisation": "immobilization",
771
+ "immobilise": "immobilize",
772
+ "immobilised": "immobilized",
773
+ "immobiliser": "immobilizer",
774
+ "immobilisers": "immobilizers",
775
+ "immobilises": "immobilizes",
776
+ "immobilising": "immobilizing",
777
+ "immortalise": "immortalize",
778
+ "immortalised": "immortalized",
779
+ "immortalises": "immortalizes",
780
+ "immortalising": "immortalizing",
781
+ "immunisation": "immunization",
782
+ "immunise": "immunize",
783
+ "immunised": "immunized",
784
+ "immunises": "immunizes",
785
+ "immunising": "immunizing",
786
+ "impanelled": "impaneled",
787
+ "impanelling": "impaneling",
788
+ "imperilled": "imperiled",
789
+ "imperilling": "imperiling",
790
+ "individualise": "individualize",
791
+ "individualised": "individualized",
792
+ "individualises": "individualizes",
793
+ "individualising": "individualizing",
794
+ "industrialise": "industrialize",
795
+ "industrialised": "industrialized",
796
+ "industrialises": "industrializes",
797
+ "industrialising": "industrializing",
798
+ "inflexion": "inflection",
799
+ "inflexions": "inflections",
800
+ "initialise": "initialize",
801
+ "initialised": "initialized",
802
+ "initialises": "initializes",
803
+ "initialising": "initializing",
804
+ "initialled": "initialed",
805
+ "initialling": "initialing",
806
+ "instal": "install",
807
+ "instalment": "installment",
808
+ "instalments": "installments",
809
+ "instals": "installs",
810
+ "instil": "instill",
811
+ "instils": "instills",
812
+ "institutionalisation": "institutionalization",
813
+ "institutionalise": "institutionalize",
814
+ "institutionalised": "institutionalized",
815
+ "institutionalises": "institutionalizes",
816
+ "institutionalising": "institutionalizing",
817
+ "intellectualise": "intellectualize",
818
+ "intellectualised": "intellectualized",
819
+ "intellectualises": "intellectualizes",
820
+ "intellectualising": "intellectualizing",
821
+ "internalisation": "internalization",
822
+ "internalise": "internalize",
823
+ "internalised": "internalized",
824
+ "internalises": "internalizes",
825
+ "internalising": "internalizing",
826
+ "internationalisation": "internationalization",
827
+ "internationalise": "internationalize",
828
+ "internationalised": "internationalized",
829
+ "internationalises": "internationalizes",
830
+ "internationalising": "internationalizing",
831
+ "ionisation": "ionization",
832
+ "ionise": "ionize",
833
+ "ionised": "ionized",
834
+ "ioniser": "ionizer",
835
+ "ionisers": "ionizers",
836
+ "ionises": "ionizes",
837
+ "ionising": "ionizing",
838
+ "italicise": "italicize",
839
+ "italicised": "italicized",
840
+ "italicises": "italicizes",
841
+ "italicising": "italicizing",
842
+ "itemise": "itemize",
843
+ "itemised": "itemized",
844
+ "itemises": "itemizes",
845
+ "itemising": "itemizing",
846
+ "jeopardise": "jeopardize",
847
+ "jeopardised": "jeopardized",
848
+ "jeopardises": "jeopardizes",
849
+ "jeopardising": "jeopardizing",
850
+ "jewelled": "jeweled",
851
+ "jeweller": "jeweler",
852
+ "jewellers": "jewelers",
853
+ "jewellery": "jewelry",
854
+ "judgement": "judgment",
855
+ "kilogramme": "kilogram",
856
+ "kilogrammes": "kilograms",
857
+ "kilometre": "kilometer",
858
+ "kilometres": "kilometers",
859
+ "labelled": "labeled",
860
+ "labelling": "labeling",
861
+ "labour": "labor",
862
+ "laboured": "labored",
863
+ "labourer": "laborer",
864
+ "labourers": "laborers",
865
+ "labouring": "laboring",
866
+ "labours": "labors",
867
+ "lacklustre": "lackluster",
868
+ "legalisation": "legalization",
869
+ "legalise": "legalize",
870
+ "legalised": "legalized",
871
+ "legalises": "legalizes",
872
+ "legalising": "legalizing",
873
+ "legitimise": "legitimize",
874
+ "legitimised": "legitimized",
875
+ "legitimises": "legitimizes",
876
+ "legitimising": "legitimizing",
877
+ "leukaemia": "leukemia",
878
+ "levelled": "leveled",
879
+ "leveller": "leveler",
880
+ "levellers": "levelers",
881
+ "levelling": "leveling",
882
+ "libelled": "libeled",
883
+ "libelling": "libeling",
884
+ "libellous": "libelous",
885
+ "liberalisation": "liberalization",
886
+ "liberalise": "liberalize",
887
+ "liberalised": "liberalized",
888
+ "liberalises": "liberalizes",
889
+ "liberalising": "liberalizing",
890
+ "licence": "license",
891
+ "licenced": "licensed",
892
+ "licences": "licenses",
893
+ "licencing": "licensing",
894
+ "likeable": "likable",
895
+ "lionisation": "lionization",
896
+ "lionise": "lionize",
897
+ "lionised": "lionized",
898
+ "lionises": "lionizes",
899
+ "lionising": "lionizing",
900
+ "liquidise": "liquidize",
901
+ "liquidised": "liquidized",
902
+ "liquidiser": "liquidizer",
903
+ "liquidisers": "liquidizers",
904
+ "liquidises": "liquidizes",
905
+ "liquidising": "liquidizing",
906
+ "litre": "liter",
907
+ "litres": "liters",
908
+ "localise": "localize",
909
+ "localised": "localized",
910
+ "localises": "localizes",
911
+ "localising": "localizing",
912
+ "louvre": "louver",
913
+ "louvred": "louvered",
914
+ "louvres": "louvers",
915
+ "lustre": "luster",
916
+ "magnetise": "magnetize",
917
+ "magnetised": "magnetized",
918
+ "magnetises": "magnetizes",
919
+ "magnetising": "magnetizing",
920
+ "manoeuvrability": "maneuverability",
921
+ "manoeuvrable": "maneuverable",
922
+ "manoeuvre": "maneuver",
923
+ "manoeuvred": "maneuvered",
924
+ "manoeuvres": "maneuvers",
925
+ "manoeuvring": "maneuvering",
926
+ "manoeuvrings": "maneuverings",
927
+ "marginalisation": "marginalization",
928
+ "marginalise": "marginalize",
929
+ "marginalised": "marginalized",
930
+ "marginalises": "marginalizes",
931
+ "marginalising": "marginalizing",
932
+ "marshalled": "marshaled",
933
+ "marshalling": "marshaling",
934
+ "marvelled": "marveled",
935
+ "marvelling": "marveling",
936
+ "marvellous": "marvelous",
937
+ "marvellously": "marvelously",
938
+ "materialisation": "materialization",
939
+ "materialise": "materialize",
940
+ "materialised": "materialized",
941
+ "materialises": "materializes",
942
+ "materialising": "materializing",
943
+ "maximisation": "maximization",
944
+ "maximise": "maximize",
945
+ "maximised": "maximized",
946
+ "maximises": "maximizes",
947
+ "maximising": "maximizing",
948
+ "meagre": "meager",
949
+ "mechanisation": "mechanization",
950
+ "mechanise": "mechanize",
951
+ "mechanised": "mechanized",
952
+ "mechanises": "mechanizes",
953
+ "mechanising": "mechanizing",
954
+ "mediaeval": "medieval",
955
+ "memorialise": "memorialize",
956
+ "memorialised": "memorialized",
957
+ "memorialises": "memorializes",
958
+ "memorialising": "memorializing",
959
+ "memorise": "memorize",
960
+ "memorised": "memorized",
961
+ "memorises": "memorizes",
962
+ "memorising": "memorizing",
963
+ "mesmerise": "mesmerize",
964
+ "mesmerised": "mesmerized",
965
+ "mesmerises": "mesmerizes",
966
+ "mesmerising": "mesmerizing",
967
+ "metabolise": "metabolize",
968
+ "metabolised": "metabolized",
969
+ "metabolises": "metabolizes",
970
+ "metabolising": "metabolizing",
971
+ "metre": "meter",
972
+ "metres": "meters",
973
+ "mhm": "hmm",
974
+ "micrometre": "micrometer",
975
+ "micrometres": "micrometers",
976
+ "militarise": "militarize",
977
+ "militarised": "militarized",
978
+ "militarises": "militarizes",
979
+ "militarising": "militarizing",
980
+ "milligramme": "milligram",
981
+ "milligrammes": "milligrams",
982
+ "millilitre": "milliliter",
983
+ "millilitres": "milliliters",
984
+ "millimetre": "millimeter",
985
+ "millimetres": "millimeters",
986
+ "miniaturisation": "miniaturization",
987
+ "miniaturise": "miniaturize",
988
+ "miniaturised": "miniaturized",
989
+ "miniaturises": "miniaturizes",
990
+ "miniaturising": "miniaturizing",
991
+ "minibusses": "minibuses",
992
+ "minimise": "minimize",
993
+ "minimised": "minimized",
994
+ "minimises": "minimizes",
995
+ "minimising": "minimizing",
996
+ "misbehaviour": "misbehavior",
997
+ "misdemeanour": "misdemeanor",
998
+ "misdemeanours": "misdemeanors",
999
+ "misspelt": "misspelled",
1000
+ "mitre": "miter",
1001
+ "mitres": "miters",
1002
+ "mm": "hmm",
1003
+ "mmm": "hmm",
1004
+ "mobilisation": "mobilization",
1005
+ "mobilise": "mobilize",
1006
+ "mobilised": "mobilized",
1007
+ "mobilises": "mobilizes",
1008
+ "mobilising": "mobilizing",
1009
+ "modelled": "modeled",
1010
+ "modeller": "modeler",
1011
+ "modellers": "modelers",
1012
+ "modelling": "modeling",
1013
+ "modernise": "modernize",
1014
+ "modernised": "modernized",
1015
+ "modernises": "modernizes",
1016
+ "modernising": "modernizing",
1017
+ "moisturise": "moisturize",
1018
+ "moisturised": "moisturized",
1019
+ "moisturiser": "moisturizer",
1020
+ "moisturisers": "moisturizers",
1021
+ "moisturises": "moisturizes",
1022
+ "moisturising": "moisturizing",
1023
+ "monologue": "monolog",
1024
+ "monologues": "monologs",
1025
+ "monopolisation": "monopolization",
1026
+ "monopolise": "monopolize",
1027
+ "monopolised": "monopolized",
1028
+ "monopolises": "monopolizes",
1029
+ "monopolising": "monopolizing",
1030
+ "moralise": "moralize",
1031
+ "moralised": "moralized",
1032
+ "moralises": "moralizes",
1033
+ "moralising": "moralizing",
1034
+ "motorised": "motorized",
1035
+ "mould": "mold",
1036
+ "moulded": "molded",
1037
+ "moulder": "molder",
1038
+ "mouldered": "moldered",
1039
+ "mouldering": "moldering",
1040
+ "moulders": "molders",
1041
+ "mouldier": "moldier",
1042
+ "mouldiest": "moldiest",
1043
+ "moulding": "molding",
1044
+ "mouldings": "moldings",
1045
+ "moulds": "molds",
1046
+ "mouldy": "moldy",
1047
+ "moult": "molt",
1048
+ "moulted": "molted",
1049
+ "moulting": "molting",
1050
+ "moults": "molts",
1051
+ "moustache": "mustache",
1052
+ "moustached": "mustached",
1053
+ "moustaches": "mustaches",
1054
+ "moustachioed": "mustachioed",
1055
+ "multicoloured": "multicolored",
1056
+ "nationalisation": "nationalization",
1057
+ "nationalisations": "nationalizations",
1058
+ "nationalise": "nationalize",
1059
+ "nationalised": "nationalized",
1060
+ "nationalises": "nationalizes",
1061
+ "nationalising": "nationalizing",
1062
+ "naturalisation": "naturalization",
1063
+ "naturalise": "naturalize",
1064
+ "naturalised": "naturalized",
1065
+ "naturalises": "naturalizes",
1066
+ "naturalising": "naturalizing",
1067
+ "neighbour": "neighbor",
1068
+ "neighbourhood": "neighborhood",
1069
+ "neighbourhoods": "neighborhoods",
1070
+ "neighbouring": "neighboring",
1071
+ "neighbourliness": "neighborliness",
1072
+ "neighbourly": "neighborly",
1073
+ "neighbours": "neighbors",
1074
+ "neutralisation": "neutralization",
1075
+ "neutralise": "neutralize",
1076
+ "neutralised": "neutralized",
1077
+ "neutralises": "neutralizes",
1078
+ "neutralising": "neutralizing",
1079
+ "normalisation": "normalization",
1080
+ "normalise": "normalize",
1081
+ "normalised": "normalized",
1082
+ "normalises": "normalizes",
1083
+ "normalising": "normalizing",
1084
+ "odour": "odor",
1085
+ "odourless": "odorless",
1086
+ "odours": "odors",
1087
+ "oesophagus": "esophagus",
1088
+ "oesophaguses": "esophaguses",
1089
+ "oestrogen": "estrogen",
1090
+ "offence": "offense",
1091
+ "offences": "offenses",
1092
+ "omelette": "omelet",
1093
+ "omelettes": "omelets",
1094
+ "optimise": "optimize",
1095
+ "optimised": "optimized",
1096
+ "optimises": "optimizes",
1097
+ "optimising": "optimizing",
1098
+ "organisation": "organization",
1099
+ "organisational": "organizational",
1100
+ "organisations": "organizations",
1101
+ "organise": "organize",
1102
+ "organised": "organized",
1103
+ "organiser": "organizer",
1104
+ "organisers": "organizers",
1105
+ "organises": "organizes",
1106
+ "organising": "organizing",
1107
+ "orthopaedic": "orthopedic",
1108
+ "orthopaedics": "orthopedics",
1109
+ "ostracise": "ostracize",
1110
+ "ostracised": "ostracized",
1111
+ "ostracises": "ostracizes",
1112
+ "ostracising": "ostracizing",
1113
+ "outmanoeuvre": "outmaneuver",
1114
+ "outmanoeuvred": "outmaneuvered",
1115
+ "outmanoeuvres": "outmaneuvers",
1116
+ "outmanoeuvring": "outmaneuvering",
1117
+ "overemphasise": "overemphasize",
1118
+ "overemphasised": "overemphasized",
1119
+ "overemphasises": "overemphasizes",
1120
+ "overemphasising": "overemphasizing",
1121
+ "oxidisation": "oxidization",
1122
+ "oxidise": "oxidize",
1123
+ "oxidised": "oxidized",
1124
+ "oxidises": "oxidizes",
1125
+ "oxidising": "oxidizing",
1126
+ "paederast": "pederast",
1127
+ "paederasts": "pederasts",
1128
+ "paediatric": "pediatric",
1129
+ "paediatrician": "pediatrician",
1130
+ "paediatricians": "pediatricians",
1131
+ "paediatrics": "pediatrics",
1132
+ "paedophile": "pedophile",
1133
+ "paedophiles": "pedophiles",
1134
+ "paedophilia": "pedophilia",
1135
+ "palaeolithic": "paleolithic",
1136
+ "palaeontologist": "paleontologist",
1137
+ "palaeontologists": "paleontologists",
1138
+ "palaeontology": "paleontology",
1139
+ "panelled": "paneled",
1140
+ "panelling": "paneling",
1141
+ "panellist": "panelist",
1142
+ "panellists": "panelists",
1143
+ "paralyse": "paralyze",
1144
+ "paralysed": "paralyzed",
1145
+ "paralyses": "paralyzes",
1146
+ "paralysing": "paralyzing",
1147
+ "parcelled": "parceled",
1148
+ "parcelling": "parceling",
1149
+ "parlour": "parlor",
1150
+ "parlours": "parlors",
1151
+ "particularise": "particularize",
1152
+ "particularised": "particularized",
1153
+ "particularises": "particularizes",
1154
+ "particularising": "particularizing",
1155
+ "passivisation": "passivization",
1156
+ "passivise": "passivize",
1157
+ "passivised": "passivized",
1158
+ "passivises": "passivizes",
1159
+ "passivising": "passivizing",
1160
+ "pasteurisation": "pasteurization",
1161
+ "pasteurise": "pasteurize",
1162
+ "pasteurised": "pasteurized",
1163
+ "pasteurises": "pasteurizes",
1164
+ "pasteurising": "pasteurizing",
1165
+ "patronise": "patronize",
1166
+ "patronised": "patronized",
1167
+ "patronises": "patronizes",
1168
+ "patronising": "patronizing",
1169
+ "patronisingly": "patronizingly",
1170
+ "pedalled": "pedaled",
1171
+ "pedalling": "pedaling",
1172
+ "pedestrianisation": "pedestrianization",
1173
+ "pedestrianise": "pedestrianize",
1174
+ "pedestrianised": "pedestrianized",
1175
+ "pedestrianises": "pedestrianizes",
1176
+ "pedestrianising": "pedestrianizing",
1177
+ "penalise": "penalize",
1178
+ "penalised": "penalized",
1179
+ "penalises": "penalizes",
1180
+ "penalising": "penalizing",
1181
+ "pencilled": "penciled",
1182
+ "pencilling": "penciling",
1183
+ "personalise": "personalize",
1184
+ "personalised": "personalized",
1185
+ "personalises": "personalizes",
1186
+ "personalising": "personalizing",
1187
+ "pharmacopoeia": "pharmacopeia",
1188
+ "pharmacopoeias": "pharmacopeias",
1189
+ "philosophise": "philosophize",
1190
+ "philosophised": "philosophized",
1191
+ "philosophises": "philosophizes",
1192
+ "philosophising": "philosophizing",
1193
+ "philtre": "filter",
1194
+ "philtres": "filters",
1195
+ "phoney": "phony",
1196
+ "plagiarise": "plagiarize",
1197
+ "plagiarised": "plagiarized",
1198
+ "plagiarises": "plagiarizes",
1199
+ "plagiarising": "plagiarizing",
1200
+ "plough": "plow",
1201
+ "ploughed": "plowed",
1202
+ "ploughing": "plowing",
1203
+ "ploughman": "plowman",
1204
+ "ploughmen": "plowmen",
1205
+ "ploughs": "plows",
1206
+ "ploughshare": "plowshare",
1207
+ "ploughshares": "plowshares",
1208
+ "polarisation": "polarization",
1209
+ "polarise": "polarize",
1210
+ "polarised": "polarized",
1211
+ "polarises": "polarizes",
1212
+ "polarising": "polarizing",
1213
+ "politicisation": "politicization",
1214
+ "politicise": "politicize",
1215
+ "politicised": "politicized",
1216
+ "politicises": "politicizes",
1217
+ "politicising": "politicizing",
1218
+ "popularisation": "popularization",
1219
+ "popularise": "popularize",
1220
+ "popularised": "popularized",
1221
+ "popularises": "popularizes",
1222
+ "popularising": "popularizing",
1223
+ "pouffe": "pouf",
1224
+ "pouffes": "poufs",
1225
+ "practise": "practice",
1226
+ "practised": "practiced",
1227
+ "practises": "practices",
1228
+ "practising": "practicing",
1229
+ "praesidium": "presidium",
1230
+ "praesidiums": "presidiums",
1231
+ "pressurisation": "pressurization",
1232
+ "pressurise": "pressurize",
1233
+ "pressurised": "pressurized",
1234
+ "pressurises": "pressurizes",
1235
+ "pressurising": "pressurizing",
1236
+ "pretence": "pretense",
1237
+ "pretences": "pretenses",
1238
+ "primaeval": "primeval",
1239
+ "prioritisation": "prioritization",
1240
+ "prioritise": "prioritize",
1241
+ "prioritised": "prioritized",
1242
+ "prioritises": "prioritizes",
1243
+ "prioritising": "prioritizing",
1244
+ "privatisation": "privatization",
1245
+ "privatisations": "privatizations",
1246
+ "privatise": "privatize",
1247
+ "privatised": "privatized",
1248
+ "privatises": "privatizes",
1249
+ "privatising": "privatizing",
1250
+ "professionalisation": "professionalization",
1251
+ "professionalise": "professionalize",
1252
+ "professionalised": "professionalized",
1253
+ "professionalises": "professionalizes",
1254
+ "professionalising": "professionalizing",
1255
+ "programme": "program",
1256
+ "programmes": "programs",
1257
+ "prologue": "prolog",
1258
+ "prologues": "prologs",
1259
+ "propagandise": "propagandize",
1260
+ "propagandised": "propagandized",
1261
+ "propagandises": "propagandizes",
1262
+ "propagandising": "propagandizing",
1263
+ "proselytise": "proselytize",
1264
+ "proselytised": "proselytized",
1265
+ "proselytiser": "proselytizer",
1266
+ "proselytisers": "proselytizers",
1267
+ "proselytises": "proselytizes",
1268
+ "proselytising": "proselytizing",
1269
+ "psychoanalyse": "psychoanalyze",
1270
+ "psychoanalysed": "psychoanalyzed",
1271
+ "psychoanalyses": "psychoanalyzes",
1272
+ "psychoanalysing": "psychoanalyzing",
1273
+ "publicise": "publicize",
1274
+ "publicised": "publicized",
1275
+ "publicises": "publicizes",
1276
+ "publicising": "publicizing",
1277
+ "pulverisation": "pulverization",
1278
+ "pulverise": "pulverize",
1279
+ "pulverised": "pulverized",
1280
+ "pulverises": "pulverizes",
1281
+ "pulverising": "pulverizing",
1282
+ "pummelled": "pummel",
1283
+ "pummelling": "pummeled",
1284
+ "pyjama": "pajama",
1285
+ "pyjamas": "pajamas",
1286
+ "pzazz": "pizzazz",
1287
+ "quarrelled": "quarreled",
1288
+ "quarrelling": "quarreling",
1289
+ "radicalise": "radicalize",
1290
+ "radicalised": "radicalized",
1291
+ "radicalises": "radicalizes",
1292
+ "radicalising": "radicalizing",
1293
+ "rancour": "rancor",
1294
+ "randomise": "randomize",
1295
+ "randomised": "randomized",
1296
+ "randomises": "randomizes",
1297
+ "randomising": "randomizing",
1298
+ "rationalisation": "rationalization",
1299
+ "rationalisations": "rationalizations",
1300
+ "rationalise": "rationalize",
1301
+ "rationalised": "rationalized",
1302
+ "rationalises": "rationalizes",
1303
+ "rationalising": "rationalizing",
1304
+ "ravelled": "raveled",
1305
+ "ravelling": "raveling",
1306
+ "realisable": "realizable",
1307
+ "realisation": "realization",
1308
+ "realisations": "realizations",
1309
+ "realise": "realize",
1310
+ "realised": "realized",
1311
+ "realises": "realizes",
1312
+ "realising": "realizing",
1313
+ "recognisable": "recognizable",
1314
+ "recognisably": "recognizably",
1315
+ "recognisance": "recognizance",
1316
+ "recognise": "recognize",
1317
+ "recognised": "recognized",
1318
+ "recognises": "recognizes",
1319
+ "recognising": "recognizing",
1320
+ "reconnoitre": "reconnoiter",
1321
+ "reconnoitred": "reconnoitered",
1322
+ "reconnoitres": "reconnoiters",
1323
+ "reconnoitring": "reconnoitering",
1324
+ "refuelled": "refueled",
1325
+ "refuelling": "refueling",
1326
+ "regularisation": "regularization",
1327
+ "regularise": "regularize",
1328
+ "regularised": "regularized",
1329
+ "regularises": "regularizes",
1330
+ "regularising": "regularizing",
1331
+ "remodelled": "remodeled",
1332
+ "remodelling": "remodeling",
1333
+ "remould": "remold",
1334
+ "remoulded": "remolded",
1335
+ "remoulding": "remolding",
1336
+ "remoulds": "remolds",
1337
+ "reorganisation": "reorganization",
1338
+ "reorganisations": "reorganizations",
1339
+ "reorganise": "reorganize",
1340
+ "reorganised": "reorganized",
1341
+ "reorganises": "reorganizes",
1342
+ "reorganising": "reorganizing",
1343
+ "revelled": "reveled",
1344
+ "reveller": "reveler",
1345
+ "revellers": "revelers",
1346
+ "revelling": "reveling",
1347
+ "revitalise": "revitalize",
1348
+ "revitalised": "revitalized",
1349
+ "revitalises": "revitalizes",
1350
+ "revitalising": "revitalizing",
1351
+ "revolutionise": "revolutionize",
1352
+ "revolutionised": "revolutionized",
1353
+ "revolutionises": "revolutionizes",
1354
+ "revolutionising": "revolutionizing",
1355
+ "rhapsodise": "rhapsodize",
1356
+ "rhapsodised": "rhapsodized",
1357
+ "rhapsodises": "rhapsodizes",
1358
+ "rhapsodising": "rhapsodizing",
1359
+ "rigour": "rigor",
1360
+ "rigours": "rigors",
1361
+ "ritualised": "ritualized",
1362
+ "rivalled": "rivaled",
1363
+ "rivalling": "rivaling",
1364
+ "romanticise": "romanticize",
1365
+ "romanticised": "romanticized",
1366
+ "romanticises": "romanticizes",
1367
+ "romanticising": "romanticizing",
1368
+ "rumour": "rumor",
1369
+ "rumoured": "rumored",
1370
+ "rumours": "rumors",
1371
+ "sabre": "saber",
1372
+ "sabres": "sabers",
1373
+ "saltpetre": "saltpeter",
1374
+ "sanitise": "sanitize",
1375
+ "sanitised": "sanitized",
1376
+ "sanitises": "sanitizes",
1377
+ "sanitising": "sanitizing",
1378
+ "satirise": "satirize",
1379
+ "satirised": "satirized",
1380
+ "satirises": "satirizes",
1381
+ "satirising": "satirizing",
1382
+ "saviour": "savior",
1383
+ "saviours": "saviors",
1384
+ "savour": "savor",
1385
+ "savoured": "savored",
1386
+ "savouries": "savories",
1387
+ "savouring": "savoring",
1388
+ "savours": "savors",
1389
+ "savoury": "savory",
1390
+ "scandalise": "scandalize",
1391
+ "scandalised": "scandalized",
1392
+ "scandalises": "scandalizes",
1393
+ "scandalising": "scandalizing",
1394
+ "sceptic": "skeptic",
1395
+ "sceptical": "skeptical",
1396
+ "sceptically": "skeptically",
1397
+ "scepticism": "skepticism",
1398
+ "sceptics": "skeptics",
1399
+ "sceptre": "scepter",
1400
+ "sceptres": "scepters",
1401
+ "scrutinise": "scrutinize",
1402
+ "scrutinised": "scrutinized",
1403
+ "scrutinises": "scrutinizes",
1404
+ "scrutinising": "scrutinizing",
1405
+ "secularisation": "secularization",
1406
+ "secularise": "secularize",
1407
+ "secularised": "secularized",
1408
+ "secularises": "secularizes",
1409
+ "secularising": "secularizing",
1410
+ "sensationalise": "sensationalize",
1411
+ "sensationalised": "sensationalized",
1412
+ "sensationalises": "sensationalizes",
1413
+ "sensationalising": "sensationalizing",
1414
+ "sensitise": "sensitize",
1415
+ "sensitised": "sensitized",
1416
+ "sensitises": "sensitizes",
1417
+ "sensitising": "sensitizing",
1418
+ "sentimentalise": "sentimentalize",
1419
+ "sentimentalised": "sentimentalized",
1420
+ "sentimentalises": "sentimentalizes",
1421
+ "sentimentalising": "sentimentalizing",
1422
+ "sepulchre": "sepulcher",
1423
+ "sepulchres": "sepulchers",
1424
+ "serialisation": "serialization",
1425
+ "serialisations": "serializations",
1426
+ "serialise": "serialize",
1427
+ "serialised": "serialized",
1428
+ "serialises": "serializes",
1429
+ "serialising": "serializing",
1430
+ "sermonise": "sermonize",
1431
+ "sermonised": "sermonized",
1432
+ "sermonises": "sermonizes",
1433
+ "sermonising": "sermonizing",
1434
+ "sheikh": "sheik",
1435
+ "shovelled": "shoveled",
1436
+ "shovelling": "shoveling",
1437
+ "shrivelled": "shriveled",
1438
+ "shrivelling": "shriveling",
1439
+ "signalise": "signalize",
1440
+ "signalised": "signalized",
1441
+ "signalises": "signalizes",
1442
+ "signalising": "signalizing",
1443
+ "signalled": "signaled",
1444
+ "signalling": "signaling",
1445
+ "smoulder": "smolder",
1446
+ "smouldered": "smoldered",
1447
+ "smouldering": "smoldering",
1448
+ "smoulders": "smolders",
1449
+ "snivelled": "sniveled",
1450
+ "snivelling": "sniveling",
1451
+ "snorkelled": "snorkeled",
1452
+ "snorkelling": "snorkeling",
1453
+ "snowplough": "snowplow",
1454
+ "snowploughs": "snowplow",
1455
+ "socialisation": "socialization",
1456
+ "socialise": "socialize",
1457
+ "socialised": "socialized",
1458
+ "socialises": "socializes",
1459
+ "socialising": "socializing",
1460
+ "sodomise": "sodomize",
1461
+ "sodomised": "sodomized",
1462
+ "sodomises": "sodomizes",
1463
+ "sodomising": "sodomizing",
1464
+ "solemnise": "solemnize",
1465
+ "solemnised": "solemnized",
1466
+ "solemnises": "solemnizes",
1467
+ "solemnising": "solemnizing",
1468
+ "sombre": "somber",
1469
+ "specialisation": "specialization",
1470
+ "specialisations": "specializations",
1471
+ "specialise": "specialize",
1472
+ "specialised": "specialized",
1473
+ "specialises": "specializes",
1474
+ "specialising": "specializing",
1475
+ "spectre": "specter",
1476
+ "spectres": "specters",
1477
+ "spiralled": "spiraled",
1478
+ "spiralling": "spiraling",
1479
+ "splendour": "splendor",
1480
+ "splendours": "splendors",
1481
+ "squirrelled": "squirreled",
1482
+ "squirrelling": "squirreling",
1483
+ "stabilisation": "stabilization",
1484
+ "stabilise": "stabilize",
1485
+ "stabilised": "stabilized",
1486
+ "stabiliser": "stabilizer",
1487
+ "stabilisers": "stabilizers",
1488
+ "stabilises": "stabilizes",
1489
+ "stabilising": "stabilizing",
1490
+ "standardisation": "standardization",
1491
+ "standardise": "standardize",
1492
+ "standardised": "standardized",
1493
+ "standardises": "standardizes",
1494
+ "standardising": "standardizing",
1495
+ "stencilled": "stenciled",
1496
+ "stencilling": "stenciling",
1497
+ "sterilisation": "sterilization",
1498
+ "sterilisations": "sterilizations",
1499
+ "sterilise": "sterilize",
1500
+ "sterilised": "sterilized",
1501
+ "steriliser": "sterilizer",
1502
+ "sterilisers": "sterilizers",
1503
+ "sterilises": "sterilizes",
1504
+ "sterilising": "sterilizing",
1505
+ "stigmatisation": "stigmatization",
1506
+ "stigmatise": "stigmatize",
1507
+ "stigmatised": "stigmatized",
1508
+ "stigmatises": "stigmatizes",
1509
+ "stigmatising": "stigmatizing",
1510
+ "storey": "story",
1511
+ "storeys": "stories",
1512
+ "subsidisation": "subsidization",
1513
+ "subsidise": "subsidize",
1514
+ "subsidised": "subsidized",
1515
+ "subsidiser": "subsidizer",
1516
+ "subsidisers": "subsidizers",
1517
+ "subsidises": "subsidizes",
1518
+ "subsidising": "subsidizing",
1519
+ "succour": "succor",
1520
+ "succoured": "succored",
1521
+ "succouring": "succoring",
1522
+ "succours": "succors",
1523
+ "sulphate": "sulfate",
1524
+ "sulphates": "sulfates",
1525
+ "sulphide": "sulfide",
1526
+ "sulphides": "sulfides",
1527
+ "sulphur": "sulfur",
1528
+ "sulphurous": "sulfurous",
1529
+ "summarise": "summarize",
1530
+ "summarised": "summarized",
1531
+ "summarises": "summarizes",
1532
+ "summarising": "summarizing",
1533
+ "swivelled": "swiveled",
1534
+ "swivelling": "swiveling",
1535
+ "symbolise": "symbolize",
1536
+ "symbolised": "symbolized",
1537
+ "symbolises": "symbolizes",
1538
+ "symbolising": "symbolizing",
1539
+ "sympathise": "sympathize",
1540
+ "sympathised": "sympathized",
1541
+ "sympathiser": "sympathizer",
1542
+ "sympathisers": "sympathizers",
1543
+ "sympathises": "sympathizes",
1544
+ "sympathising": "sympathizing",
1545
+ "synchronisation": "synchronization",
1546
+ "synchronise": "synchronize",
1547
+ "synchronised": "synchronized",
1548
+ "synchronises": "synchronizes",
1549
+ "synchronising": "synchronizing",
1550
+ "synthesise": "synthesize",
1551
+ "synthesised": "synthesized",
1552
+ "synthesiser": "synthesizer",
1553
+ "synthesisers": "synthesizers",
1554
+ "synthesises": "synthesizes",
1555
+ "synthesising": "synthesizing",
1556
+ "syphon": "siphon",
1557
+ "syphoned": "siphoned",
1558
+ "syphoning": "siphoning",
1559
+ "syphons": "siphons",
1560
+ "systematisation": "systematization",
1561
+ "systematise": "systematize",
1562
+ "systematised": "systematized",
1563
+ "systematises": "systematizes",
1564
+ "systematising": "systematizing",
1565
+ "tantalise": "tantalize",
1566
+ "tantalised": "tantalized",
1567
+ "tantalises": "tantalizes",
1568
+ "tantalising": "tantalizing",
1569
+ "tantalisingly": "tantalizingly",
1570
+ "tasselled": "tasseled",
1571
+ "technicolour": "technicolor",
1572
+ "temporise": "temporize",
1573
+ "temporised": "temporized",
1574
+ "temporises": "temporizes",
1575
+ "temporising": "temporizing",
1576
+ "tenderise": "tenderize",
1577
+ "tenderised": "tenderized",
1578
+ "tenderises": "tenderizes",
1579
+ "tenderising": "tenderizing",
1580
+ "terrorise": "terrorize",
1581
+ "terrorised": "terrorized",
1582
+ "terrorises": "terrorizes",
1583
+ "terrorising": "terrorizing",
1584
+ "theatre": "theater",
1585
+ "theatregoer": "theatergoer",
1586
+ "theatregoers": "theatergoers",
1587
+ "theatres": "theaters",
1588
+ "theorise": "theorize",
1589
+ "theorised": "theorized",
1590
+ "theorises": "theorizes",
1591
+ "theorising": "theorizing",
1592
+ "tonne": "ton",
1593
+ "tonnes": "tons",
1594
+ "towelled": "toweled",
1595
+ "towelling": "toweling",
1596
+ "toxaemia": "toxemia",
1597
+ "tranquillise": "tranquilize",
1598
+ "tranquillised": "tranquilized",
1599
+ "tranquilliser": "tranquilizer",
1600
+ "tranquillisers": "tranquilizers",
1601
+ "tranquillises": "tranquilizes",
1602
+ "tranquillising": "tranquilizing",
1603
+ "tranquillity": "tranquility",
1604
+ "tranquillize": "tranquilize",
1605
+ "tranquillized": "tranquilized",
1606
+ "tranquillizer": "tranquilizer",
1607
+ "tranquillizers": "tranquilizers",
1608
+ "tranquillizes": "tranquilizes",
1609
+ "tranquillizing": "tranquilizing",
1610
+ "tranquilly": "tranquility",
1611
+ "transistorised": "transistorized",
1612
+ "traumatise": "traumatize",
1613
+ "traumatised": "traumatized",
1614
+ "traumatises": "traumatizes",
1615
+ "traumatising": "traumatizing",
1616
+ "travelled": "traveled",
1617
+ "traveller": "traveler",
1618
+ "travellers": "travelers",
1619
+ "travelling": "traveling",
1620
+ "travelog": "travelogue",
1621
+ "travelogs": "travelogues",
1622
+ "trialled": "trialed",
1623
+ "trialling": "trialing",
1624
+ "tricolour": "tricolor",
1625
+ "tricolours": "tricolors",
1626
+ "trivialise": "trivialize",
1627
+ "trivialised": "trivialized",
1628
+ "trivialises": "trivializes",
1629
+ "trivialising": "trivializing",
1630
+ "tumour": "tumor",
1631
+ "tumours": "tumors",
1632
+ "tunnelled": "tunneled",
1633
+ "tunnelling": "tunneling",
1634
+ "tyrannise": "tyrannize",
1635
+ "tyrannised": "tyrannized",
1636
+ "tyrannises": "tyrannizes",
1637
+ "tyrannising": "tyrannizing",
1638
+ "tyre": "tire",
1639
+ "tyres": "tires",
1640
+ "unauthorised": "unauthorized",
1641
+ "uncivilised": "uncivilized",
1642
+ "underutilised": "underutilized",
1643
+ "unequalled": "unequaled",
1644
+ "unfavourable": "unfavorable",
1645
+ "unfavourably": "unfavorably",
1646
+ "unionisation": "unionization",
1647
+ "unionise": "unionize",
1648
+ "unionised": "unionized",
1649
+ "unionises": "unionizes",
1650
+ "unionising": "unionizing",
1651
+ "unorganised": "unorganized",
1652
+ "unravelled": "unraveled",
1653
+ "unravelling": "unraveling",
1654
+ "unrecognisable": "unrecognizable",
1655
+ "unrecognised": "unrecognized",
1656
+ "unrivalled": "unrivaled",
1657
+ "unsavoury": "unsavory",
1658
+ "untrammelled": "untrammeled",
1659
+ "urbanisation": "urbanization",
1660
+ "urbanise": "urbanize",
1661
+ "urbanised": "urbanized",
1662
+ "urbanises": "urbanizes",
1663
+ "urbanising": "urbanizing",
1664
+ "utilisable": "utilizable",
1665
+ "utilisation": "utilization",
1666
+ "utilise": "utilize",
1667
+ "utilised": "utilized",
1668
+ "utilises": "utilizes",
1669
+ "utilising": "utilizing",
1670
+ "valour": "valor",
1671
+ "vandalise": "vandalize",
1672
+ "vandalised": "vandalized",
1673
+ "vandalises": "vandalizes",
1674
+ "vandalising": "vandalizing",
1675
+ "vaporisation": "vaporization",
1676
+ "vaporise": "vaporize",
1677
+ "vaporised": "vaporized",
1678
+ "vaporises": "vaporizes",
1679
+ "vaporising": "vaporizing",
1680
+ "vapour": "vapor",
1681
+ "vapours": "vapors",
1682
+ "verbalise": "verbalize",
1683
+ "verbalised": "verbalized",
1684
+ "verbalises": "verbalizes",
1685
+ "verbalising": "verbalizing",
1686
+ "victimisation": "victimization",
1687
+ "victimise": "victimize",
1688
+ "victimised": "victimized",
1689
+ "victimises": "victimizes",
1690
+ "victimising": "victimizing",
1691
+ "videodisc": "videodisk",
1692
+ "videodiscs": "videodisks",
1693
+ "vigour": "vigor",
1694
+ "visualisation": "visualization",
1695
+ "visualisations": "visualizations",
1696
+ "visualise": "visualize",
1697
+ "visualised": "visualized",
1698
+ "visualises": "visualizes",
1699
+ "visualising": "visualizing",
1700
+ "vocalisation": "vocalization",
1701
+ "vocalisations": "vocalizations",
1702
+ "vocalise": "vocalize",
1703
+ "vocalised": "vocalized",
1704
+ "vocalises": "vocalizes",
1705
+ "vocalising": "vocalizing",
1706
+ "vulcanised": "vulcanized",
1707
+ "vulgarisation": "vulgarization",
1708
+ "vulgarise": "vulgarize",
1709
+ "vulgarised": "vulgarized",
1710
+ "vulgarises": "vulgarizes",
1711
+ "vulgarising": "vulgarizing",
1712
+ "waggon": "wagon",
1713
+ "waggons": "wagons",
1714
+ "watercolour": "watercolor",
1715
+ "watercolours": "watercolors",
1716
+ "weaselled": "weaseled",
1717
+ "weaselling": "weaseling",
1718
+ "westernisation": "westernization",
1719
+ "westernise": "westernize",
1720
+ "westernised": "westernized",
1721
+ "westernises": "westernizes",
1722
+ "westernising": "westernizing",
1723
+ "womanise": "womanize",
1724
+ "womanised": "womanized",
1725
+ "womaniser": "womanizer",
1726
+ "womanisers": "womanizers",
1727
+ "womanises": "womanizes",
1728
+ "womanising": "womanizing",
1729
+ "woollen": "woolen",
1730
+ "woollens": "woolens",
1731
+ "woollies": "woolies",
1732
+ "woolly": "wooly",
1733
+ "worshipped": "worshiped",
1734
+ "worshipper": "worshiper",
1735
+ "worshipping": "worshiping",
1736
+ "yodelled": "yodeled",
1737
+ "yodelling": "yodeling",
1738
+ "yoghourt": "yogurt",
1739
+ "yoghourts": "yogurts",
1740
+ "yoghurt": "yogurt",
1741
+ "yoghurts": "yogurts"
1742
+ }
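The block above is the tail of a large British-to-American spelling map (its file name falls outside this part of the diff, but the format matches the English spelling normalizer that ships alongside Whisper tokenizers). Below is a minimal, hypothetical sketch of how such a mapping could be applied to normalize text before computing WER; the file path and function names are assumptions, not part of this repository.

```python
import json
import re

def load_spelling_map(path: str) -> dict:
    # The JSON maps British spellings to American ones,
    # e.g. "synthesiser" -> "synthesizer", "colour" -> "color".
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def normalize_spelling(text: str, mapping: dict) -> str:
    # Replace whole words only, case-insensitively, so both "Theatre"
    # and "theatre" become "theater" before WER is computed.
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, mapping)) + r")\b", re.IGNORECASE
    )
    return pattern.sub(lambda m: mapping[m.group(0).lower()], text)

# Example with a hypothetical path:
# mapping = load_spelling_map("english_spelling_mapping.json")
# normalize_spelling("the theatre specialises in colour", mapping)
# -> "the theater specializes in color"
```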
preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
1
+ {
2
+ "chunk_length": 30,
3
+ "feature_extractor_type": "WhisperFeatureExtractor",
4
+ "feature_size": 80,
5
+ "hop_length": 160,
6
+ "n_fft": 400,
7
+ "n_samples": 480000,
8
+ "nb_max_frames": 3000,
9
+ "padding_side": "right",
10
+ "padding_value": 0.0,
11
+ "processor_class": "WhisperProcessor",
12
+ "return_attention_mask": false,
13
+ "sampling_rate": 16000
14
+ }
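preprocessor_config.json describes the Whisper feature extractor: 16 kHz audio, 30-second chunks (480,000 samples), and 80 log-mel bins computed with a 400-point FFT at a hop length of 160, giving 3,000 frames per chunk. A minimal sketch of loading and exercising it via transformers; the repository id is an assumption.

```python
import numpy as np
from transformers import WhisperProcessor

# Repo id is assumed; substitute the actual model path.
processor = WhisperProcessor.from_pretrained("sarpba/whisper-base-hu-V2")

# Five seconds of silence at the configured 16 kHz sampling rate.
audio = np.zeros(5 * 16000, dtype=np.float32)

# The feature extractor pads/truncates to 30 s (n_samples = 480000) and
# returns an 80 x 3000 log-mel spectrogram (feature_size x nb_max_frames).
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")
print(inputs.input_features.shape)  # torch.Size([1, 80, 3000])
```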
runs/Dec29_17-27-14_sarpba-desktop/events.out.tfevents.1735497535.sarpba-desktop.15381.0 ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:e88a63842b95a1fab30056dc87792da74d544838b47d98e17dfec07d8e1e9448
3
+ size 88422
runs/Dec29_17-27-14_sarpba-desktop/events.out.tfevents.1735548810.sarpba-desktop.15381.1 ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:0191b9fa202bf2b80f1155a6b0b66029a6e8e0032d1eb25467a7554180fb1a48
3
+ size 412
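The two runs/... files are Git LFS pointers to TensorBoard event logs from this training run (the first, ~88 kB, carries the training and evaluation curves; the second is a small final summary). A hedged sketch of inspecting them locally with the tensorboard package; the directory path and scalar tag names are guesses at the Trainer's logging convention.

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Path is assumed to be the downloaded run directory from this repo.
acc = EventAccumulator("runs/Dec29_17-27-14_sarpba-desktop")
acc.Reload()

print(acc.Tags()["scalars"])           # list the scalar tags actually logged
for event in acc.Scalars("eval/wer"):  # tag name is an assumption
    print(event.step, event.value)
```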
special_tokens_map.json ADDED
@@ -0,0 +1,139 @@
1
+ {
2
+ "additional_special_tokens": [
3
+ "<|endoftext|>",
4
+ "<|startoftranscript|>",
5
+ "<|en|>",
6
+ "<|zh|>",
7
+ "<|de|>",
8
+ "<|es|>",
9
+ "<|ru|>",
10
+ "<|ko|>",
11
+ "<|fr|>",
12
+ "<|ja|>",
13
+ "<|pt|>",
14
+ "<|tr|>",
15
+ "<|pl|>",
16
+ "<|ca|>",
17
+ "<|nl|>",
18
+ "<|ar|>",
19
+ "<|sv|>",
20
+ "<|it|>",
21
+ "<|id|>",
22
+ "<|hi|>",
23
+ "<|fi|>",
24
+ "<|vi|>",
25
+ "<|he|>",
26
+ "<|uk|>",
27
+ "<|el|>",
28
+ "<|ms|>",
29
+ "<|cs|>",
30
+ "<|ro|>",
31
+ "<|da|>",
32
+ "<|hu|>",
33
+ "<|ta|>",
34
+ "<|no|>",
35
+ "<|th|>",
36
+ "<|ur|>",
37
+ "<|hr|>",
38
+ "<|bg|>",
39
+ "<|lt|>",
40
+ "<|la|>",
41
+ "<|mi|>",
42
+ "<|ml|>",
43
+ "<|cy|>",
44
+ "<|sk|>",
45
+ "<|te|>",
46
+ "<|fa|>",
47
+ "<|lv|>",
48
+ "<|bn|>",
49
+ "<|sr|>",
50
+ "<|az|>",
51
+ "<|sl|>",
52
+ "<|kn|>",
53
+ "<|et|>",
54
+ "<|mk|>",
55
+ "<|br|>",
56
+ "<|eu|>",
57
+ "<|is|>",
58
+ "<|hy|>",
59
+ "<|ne|>",
60
+ "<|mn|>",
61
+ "<|bs|>",
62
+ "<|kk|>",
63
+ "<|sq|>",
64
+ "<|sw|>",
65
+ "<|gl|>",
66
+ "<|mr|>",
67
+ "<|pa|>",
68
+ "<|si|>",
69
+ "<|km|>",
70
+ "<|sn|>",
71
+ "<|yo|>",
72
+ "<|so|>",
73
+ "<|af|>",
74
+ "<|oc|>",
75
+ "<|ka|>",
76
+ "<|be|>",
77
+ "<|tg|>",
78
+ "<|sd|>",
79
+ "<|gu|>",
80
+ "<|am|>",
81
+ "<|yi|>",
82
+ "<|lo|>",
83
+ "<|uz|>",
84
+ "<|fo|>",
85
+ "<|ht|>",
86
+ "<|ps|>",
87
+ "<|tk|>",
88
+ "<|nn|>",
89
+ "<|mt|>",
90
+ "<|sa|>",
91
+ "<|lb|>",
92
+ "<|my|>",
93
+ "<|bo|>",
94
+ "<|tl|>",
95
+ "<|mg|>",
96
+ "<|as|>",
97
+ "<|tt|>",
98
+ "<|haw|>",
99
+ "<|ln|>",
100
+ "<|ha|>",
101
+ "<|ba|>",
102
+ "<|jw|>",
103
+ "<|su|>",
104
+ "<|translate|>",
105
+ "<|transcribe|>",
106
+ "<|startoflm|>",
107
+ "<|startofprev|>",
108
+ "<|nocaptions|>",
109
+ "<|notimestamps|>"
110
+ ],
111
+ "bos_token": {
112
+ "content": "<|endoftext|>",
113
+ "lstrip": false,
114
+ "normalized": false,
115
+ "rstrip": false,
116
+ "single_word": false
117
+ },
118
+ "eos_token": {
119
+ "content": "<|endoftext|>",
120
+ "lstrip": false,
121
+ "normalized": false,
122
+ "rstrip": false,
123
+ "single_word": false
124
+ },
125
+ "pad_token": {
126
+ "content": "<|endoftext|>",
127
+ "lstrip": false,
128
+ "normalized": false,
129
+ "rstrip": false,
130
+ "single_word": false
131
+ },
132
+ "unk_token": {
133
+ "content": "<|endoftext|>",
134
+ "lstrip": false,
135
+ "normalized": false,
136
+ "rstrip": false,
137
+ "single_word": false
138
+ }
139
+ }
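special_tokens_map.json lists the multilingual Whisper control tokens, including the Hungarian language token <|hu|>, with <|endoftext|> reused as BOS, EOS, pad, and unk. A minimal sketch of pinning the decoder to Hungarian transcription at inference time; the repository id and the silent placeholder audio are assumptions.

```python
import numpy as np
from transformers import WhisperForConditionalGeneration, WhisperProcessor

repo = "sarpba/whisper-base-hu-V2"  # assumed repo id
processor = WhisperProcessor.from_pretrained(repo)
model = WhisperForConditionalGeneration.from_pretrained(repo)

# Builds the <|startoftranscript|><|hu|><|transcribe|><|notimestamps|>
# prompt from the special tokens listed above.
forced_ids = processor.get_decoder_prompt_ids(language="hungarian", task="transcribe")

audio = np.zeros(16000, dtype=np.float32)  # 1 s of silence at 16 kHz
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")
generated = model.generate(inputs.input_features, forced_decoder_ids=forced_ids)
print(processor.batch_decode(generated, skip_special_tokens=True)[0])
```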
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
The diff for this file is too large to render. See raw diff
 
train_results.json ADDED
@@ -0,0 +1,9 @@
1
+ {
2
+ "epoch": 2.999774113395076,
3
+ "total_flos": 2.756290459511145e+20,
4
+ "train_loss": 0.20740618139383307,
5
+ "train_runtime": 51184.2268,
6
+ "train_samples": 1416604,
7
+ "train_samples_per_second": 83.03,
8
+ "train_steps_per_second": 0.649
9
+ }
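train_results.json summarizes the run: roughly 1.42 M training samples per epoch over essentially three full epochs in about 51,184 seconds. A small consistency check on the reported throughput, using only values that appear in this commit (global_step is taken from trainer_state.json below).

```python
# Values copied from train_results.json / trainer_state.json in this commit.
train_samples = 1_416_604
train_runtime = 51_184.2268   # seconds
num_epochs = 3.0              # epoch = 2.9998 is effectively three full passes
global_step = 33_201          # from trainer_state.json

samples_per_second = train_samples * num_epochs / train_runtime
steps_per_second = global_step / train_runtime

print(round(samples_per_second, 2))  # 83.03 -> matches train_samples_per_second
print(round(steps_per_second, 3))    # 0.649 -> matches train_steps_per_second
```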
trainer_state.json ADDED
@@ -0,0 +1,2663 @@
1
+ {
2
+ "best_metric": null,
3
+ "best_model_checkpoint": null,
4
+ "epoch": 2.999774113395076,
5
+ "eval_steps": 1000,
6
+ "global_step": 33201,
7
+ "is_hyper_param_search": false,
8
+ "is_local_process_zero": true,
9
+ "is_world_process_zero": true,
10
+ "log_history": [
11
+ {
12
+ "epoch": 0.00903546419697312,
13
+ "grad_norm": 7.14754056930542,
14
+ "learning_rate": 6.8599999999999995e-06,
15
+ "loss": 1.8388,
16
+ "step": 100
17
+ },
18
+ {
19
+ "epoch": 0.01807092839394624,
20
+ "grad_norm": 6.974137783050537,
21
+ "learning_rate": 1.386e-05,
22
+ "loss": 1.2229,
23
+ "step": 200
24
+ },
25
+ {
26
+ "epoch": 0.02710639259091936,
27
+ "grad_norm": 6.240023612976074,
28
+ "learning_rate": 2.0859999999999997e-05,
29
+ "loss": 0.9997,
30
+ "step": 300
31
+ },
32
+ {
33
+ "epoch": 0.03614185678789248,
34
+ "grad_norm": 5.526381492614746,
35
+ "learning_rate": 2.7859999999999998e-05,
36
+ "loss": 0.8603,
37
+ "step": 400
38
+ },
39
+ {
40
+ "epoch": 0.0451773209848656,
41
+ "grad_norm": 6.0143938064575195,
42
+ "learning_rate": 3.4859999999999995e-05,
43
+ "loss": 0.7584,
44
+ "step": 500
45
+ },
46
+ {
47
+ "epoch": 0.05421278518183872,
48
+ "grad_norm": 5.181371212005615,
49
+ "learning_rate": 4.1859999999999996e-05,
50
+ "loss": 0.6841,
51
+ "step": 600
52
+ },
53
+ {
54
+ "epoch": 0.06324824937881184,
55
+ "grad_norm": 4.984240531921387,
56
+ "learning_rate": 4.885999999999999e-05,
57
+ "loss": 0.644,
58
+ "step": 700
59
+ },
60
+ {
61
+ "epoch": 0.07228371357578496,
62
+ "grad_norm": 5.0428266525268555,
63
+ "learning_rate": 5.586e-05,
64
+ "loss": 0.6025,
65
+ "step": 800
66
+ },
67
+ {
68
+ "epoch": 0.08131917777275807,
69
+ "grad_norm": 4.736971378326416,
70
+ "learning_rate": 6.285999999999999e-05,
71
+ "loss": 0.5776,
72
+ "step": 900
73
+ },
74
+ {
75
+ "epoch": 0.0903546419697312,
76
+ "grad_norm": 4.559544086456299,
77
+ "learning_rate": 6.986e-05,
78
+ "loss": 0.551,
79
+ "step": 1000
80
+ },
81
+ {
82
+ "epoch": 0.0903546419697312,
83
+ "eval_loss": 0.2710006833076477,
84
+ "eval_runtime": 89.4841,
85
+ "eval_samples_per_second": 47.64,
86
+ "eval_steps_per_second": 0.749,
87
+ "eval_wer": 0.26942939113802994,
88
+ "step": 1000
89
+ },
90
+ {
91
+ "epoch": 0.09939010616670431,
92
+ "grad_norm": 4.705749988555908,
93
+ "learning_rate": 6.978696313779074e-05,
94
+ "loss": 0.5356,
95
+ "step": 1100
96
+ },
97
+ {
98
+ "epoch": 0.10842557036367743,
99
+ "grad_norm": 4.287839412689209,
100
+ "learning_rate": 6.9569578584516e-05,
101
+ "loss": 0.5058,
102
+ "step": 1200
103
+ },
104
+ {
105
+ "epoch": 0.11746103456065056,
106
+ "grad_norm": 3.9484827518463135,
107
+ "learning_rate": 6.935219403124125e-05,
108
+ "loss": 0.4863,
109
+ "step": 1300
110
+ },
111
+ {
112
+ "epoch": 0.12649649875762367,
113
+ "grad_norm": 4.207424640655518,
114
+ "learning_rate": 6.913480947796651e-05,
115
+ "loss": 0.468,
116
+ "step": 1400
117
+ },
118
+ {
119
+ "epoch": 0.1355319629545968,
120
+ "grad_norm": 4.078378200531006,
121
+ "learning_rate": 6.891742492469178e-05,
122
+ "loss": 0.4522,
123
+ "step": 1500
124
+ },
125
+ {
126
+ "epoch": 0.14456742715156992,
127
+ "grad_norm": 3.6946797370910645,
128
+ "learning_rate": 6.870004037141703e-05,
129
+ "loss": 0.4396,
130
+ "step": 1600
131
+ },
132
+ {
133
+ "epoch": 0.15360289134854302,
134
+ "grad_norm": 3.742530345916748,
135
+ "learning_rate": 6.848265581814229e-05,
136
+ "loss": 0.4338,
137
+ "step": 1700
138
+ },
139
+ {
140
+ "epoch": 0.16263835554551614,
141
+ "grad_norm": 4.0423078536987305,
142
+ "learning_rate": 6.826527126486755e-05,
143
+ "loss": 0.4232,
144
+ "step": 1800
145
+ },
146
+ {
147
+ "epoch": 0.17167381974248927,
148
+ "grad_norm": 3.7348833084106445,
149
+ "learning_rate": 6.80478867115928e-05,
150
+ "loss": 0.4144,
151
+ "step": 1900
152
+ },
153
+ {
154
+ "epoch": 0.1807092839394624,
155
+ "grad_norm": 3.4496703147888184,
156
+ "learning_rate": 6.783050215831805e-05,
157
+ "loss": 0.4016,
158
+ "step": 2000
159
+ },
160
+ {
161
+ "epoch": 0.1807092839394624,
162
+ "eval_loss": 0.20093074440956116,
163
+ "eval_runtime": 89.3016,
164
+ "eval_samples_per_second": 47.737,
165
+ "eval_steps_per_second": 0.75,
166
+ "eval_wer": 0.20614174901710763,
167
+ "step": 2000
168
+ },
169
+ {
170
+ "epoch": 0.18974474813643552,
171
+ "grad_norm": 3.3866732120513916,
172
+ "learning_rate": 6.761311760504332e-05,
173
+ "loss": 0.3858,
174
+ "step": 2100
175
+ },
176
+ {
177
+ "epoch": 0.19878021233340862,
178
+ "grad_norm": 4.071012496948242,
179
+ "learning_rate": 6.739573305176857e-05,
180
+ "loss": 0.3875,
181
+ "step": 2200
182
+ },
183
+ {
184
+ "epoch": 0.20781567653038174,
185
+ "grad_norm": 3.373796224594116,
186
+ "learning_rate": 6.717834849849383e-05,
187
+ "loss": 0.3795,
188
+ "step": 2300
189
+ },
190
+ {
191
+ "epoch": 0.21685114072735487,
192
+ "grad_norm": 3.105025291442871,
193
+ "learning_rate": 6.696096394521908e-05,
194
+ "loss": 0.3787,
195
+ "step": 2400
196
+ },
197
+ {
198
+ "epoch": 0.225886604924328,
199
+ "grad_norm": 3.8723206520080566,
200
+ "learning_rate": 6.674357939194434e-05,
201
+ "loss": 0.3716,
202
+ "step": 2500
203
+ },
204
+ {
205
+ "epoch": 0.23492206912130112,
206
+ "grad_norm": 3.2043449878692627,
207
+ "learning_rate": 6.65261948386696e-05,
208
+ "loss": 0.3662,
209
+ "step": 2600
210
+ },
211
+ {
212
+ "epoch": 0.24395753331827422,
213
+ "grad_norm": 3.2647688388824463,
214
+ "learning_rate": 6.631098413092761e-05,
215
+ "loss": 0.3567,
216
+ "step": 2700
217
+ },
218
+ {
219
+ "epoch": 0.25299299751524734,
220
+ "grad_norm": 3.255851984024048,
221
+ "learning_rate": 6.609359957765287e-05,
222
+ "loss": 0.3541,
223
+ "step": 2800
224
+ },
225
+ {
226
+ "epoch": 0.26202846171222044,
227
+ "grad_norm": 3.103607177734375,
228
+ "learning_rate": 6.587621502437812e-05,
229
+ "loss": 0.3551,
230
+ "step": 2900
231
+ },
232
+ {
233
+ "epoch": 0.2710639259091936,
234
+ "grad_norm": 3.7592177391052246,
235
+ "learning_rate": 6.565883047110337e-05,
236
+ "loss": 0.3449,
237
+ "step": 3000
238
+ },
239
+ {
240
+ "epoch": 0.2710639259091936,
241
+ "eval_loss": 0.17070473730564117,
242
+ "eval_runtime": 88.3474,
243
+ "eval_samples_per_second": 48.253,
244
+ "eval_steps_per_second": 0.758,
245
+ "eval_wer": 0.17702688343427903,
246
+ "step": 3000
247
+ },
248
+ {
249
+ "epoch": 0.2800993901061667,
250
+ "grad_norm": 2.7764692306518555,
251
+ "learning_rate": 6.544144591782863e-05,
252
+ "loss": 0.3477,
253
+ "step": 3100
254
+ },
255
+ {
256
+ "epoch": 0.28913485430313984,
257
+ "grad_norm": 2.980421543121338,
258
+ "learning_rate": 6.522406136455388e-05,
259
+ "loss": 0.3367,
260
+ "step": 3200
261
+ },
262
+ {
263
+ "epoch": 0.29817031850011294,
264
+ "grad_norm": 3.0955636501312256,
265
+ "learning_rate": 6.500667681127915e-05,
266
+ "loss": 0.3347,
267
+ "step": 3300
268
+ },
269
+ {
270
+ "epoch": 0.30720578269708604,
271
+ "grad_norm": 2.942781925201416,
272
+ "learning_rate": 6.47892922580044e-05,
273
+ "loss": 0.3363,
274
+ "step": 3400
275
+ },
276
+ {
277
+ "epoch": 0.3162412468940592,
278
+ "grad_norm": 2.7990803718566895,
279
+ "learning_rate": 6.457190770472966e-05,
280
+ "loss": 0.3324,
281
+ "step": 3500
282
+ },
283
+ {
284
+ "epoch": 0.3252767110910323,
285
+ "grad_norm": 3.0384480953216553,
286
+ "learning_rate": 6.435452315145492e-05,
287
+ "loss": 0.3273,
288
+ "step": 3600
289
+ },
290
+ {
291
+ "epoch": 0.33431217528800544,
292
+ "grad_norm": 2.8415443897247314,
293
+ "learning_rate": 6.413713859818017e-05,
294
+ "loss": 0.3231,
295
+ "step": 3700
296
+ },
297
+ {
298
+ "epoch": 0.34334763948497854,
299
+ "grad_norm": 2.706265687942505,
300
+ "learning_rate": 6.391975404490544e-05,
301
+ "loss": 0.3224,
302
+ "step": 3800
303
+ },
304
+ {
305
+ "epoch": 0.35238310368195164,
306
+ "grad_norm": 2.77278995513916,
307
+ "learning_rate": 6.370236949163069e-05,
308
+ "loss": 0.32,
309
+ "step": 3900
310
+ },
311
+ {
312
+ "epoch": 0.3614185678789248,
313
+ "grad_norm": 2.9242990016937256,
314
+ "learning_rate": 6.348498493835595e-05,
315
+ "loss": 0.3147,
316
+ "step": 4000
317
+ },
318
+ {
319
+ "epoch": 0.3614185678789248,
320
+ "eval_loss": 0.1588164120912552,
321
+ "eval_runtime": 89.3911,
322
+ "eval_samples_per_second": 47.689,
323
+ "eval_steps_per_second": 0.75,
324
+ "eval_wer": 0.1649984061204973,
325
+ "step": 4000
326
+ },
327
+ {
328
+ "epoch": 0.3704540320758979,
329
+ "grad_norm": 3.196282148361206,
330
+ "learning_rate": 6.32676003850812e-05,
331
+ "loss": 0.3112,
332
+ "step": 4100
333
+ },
334
+ {
335
+ "epoch": 0.37948949627287104,
336
+ "grad_norm": 3.880776882171631,
337
+ "learning_rate": 6.305021583180646e-05,
338
+ "loss": 0.3154,
339
+ "step": 4200
340
+ },
341
+ {
342
+ "epoch": 0.38852496046984414,
343
+ "grad_norm": 2.7569668292999268,
344
+ "learning_rate": 6.283283127853171e-05,
345
+ "loss": 0.3108,
346
+ "step": 4300
347
+ },
348
+ {
349
+ "epoch": 0.39756042466681724,
350
+ "grad_norm": 2.951040267944336,
351
+ "learning_rate": 6.261544672525697e-05,
352
+ "loss": 0.3093,
353
+ "step": 4400
354
+ },
355
+ {
356
+ "epoch": 0.4065958888637904,
357
+ "grad_norm": 2.667750358581543,
358
+ "learning_rate": 6.239806217198222e-05,
359
+ "loss": 0.3082,
360
+ "step": 4500
361
+ },
362
+ {
363
+ "epoch": 0.4156313530607635,
364
+ "grad_norm": 2.872540235519409,
365
+ "learning_rate": 6.218067761870749e-05,
366
+ "loss": 0.3005,
367
+ "step": 4600
368
+ },
369
+ {
370
+ "epoch": 0.42466681725773664,
371
+ "grad_norm": 3.15378999710083,
372
+ "learning_rate": 6.196329306543275e-05,
373
+ "loss": 0.2994,
374
+ "step": 4700
375
+ },
376
+ {
377
+ "epoch": 0.43370228145470974,
378
+ "grad_norm": 2.879260301589966,
379
+ "learning_rate": 6.1745908512158e-05,
380
+ "loss": 0.2959,
381
+ "step": 4800
382
+ },
383
+ {
384
+ "epoch": 0.44273774565168283,
385
+ "grad_norm": 2.811612367630005,
386
+ "learning_rate": 6.152852395888326e-05,
387
+ "loss": 0.2974,
388
+ "step": 4900
389
+ },
390
+ {
391
+ "epoch": 0.451773209848656,
392
+ "grad_norm": 2.7307889461517334,
393
+ "learning_rate": 6.131113940560851e-05,
394
+ "loss": 0.2936,
395
+ "step": 5000
396
+ },
397
+ {
398
+ "epoch": 0.451773209848656,
399
+ "eval_loss": 0.1471971571445465,
400
+ "eval_runtime": 88.7501,
401
+ "eval_samples_per_second": 48.034,
402
+ "eval_steps_per_second": 0.755,
403
+ "eval_wer": 0.1551376049304006,
404
+ "step": 5000
405
+ },
406
+ {
407
+ "epoch": 0.4608086740456291,
408
+ "grad_norm": 2.734050750732422,
409
+ "learning_rate": 6.109375485233378e-05,
410
+ "loss": 0.2917,
411
+ "step": 5100
412
+ },
413
+ {
414
+ "epoch": 0.46984413824260224,
415
+ "grad_norm": 2.650491952896118,
416
+ "learning_rate": 6.0876370299059026e-05,
417
+ "loss": 0.2929,
418
+ "step": 5200
419
+ },
420
+ {
421
+ "epoch": 0.47887960243957534,
422
+ "grad_norm": 2.519413709640503,
423
+ "learning_rate": 6.065898574578429e-05,
424
+ "loss": 0.2919,
425
+ "step": 5300
426
+ },
427
+ {
428
+ "epoch": 0.48791506663654843,
429
+ "grad_norm": 2.6014676094055176,
430
+ "learning_rate": 6.0441601192509545e-05,
431
+ "loss": 0.2811,
432
+ "step": 5400
433
+ },
434
+ {
435
+ "epoch": 0.4969505308335216,
436
+ "grad_norm": 2.7325778007507324,
437
+ "learning_rate": 6.02242166392348e-05,
438
+ "loss": 0.2878,
439
+ "step": 5500
440
+ },
441
+ {
442
+ "epoch": 0.5059859950304947,
443
+ "grad_norm": 2.636491298675537,
444
+ "learning_rate": 6.000683208596006e-05,
445
+ "loss": 0.2821,
446
+ "step": 5600
447
+ },
448
+ {
449
+ "epoch": 0.5150214592274678,
450
+ "grad_norm": 2.6922860145568848,
451
+ "learning_rate": 5.9789447532685315e-05,
452
+ "loss": 0.2828,
453
+ "step": 5700
454
+ },
455
+ {
456
+ "epoch": 0.5240569234244409,
457
+ "grad_norm": 2.4657480716705322,
458
+ "learning_rate": 5.957206297941057e-05,
459
+ "loss": 0.2845,
460
+ "step": 5800
461
+ },
462
+ {
463
+ "epoch": 0.5330923876214141,
464
+ "grad_norm": 2.6574530601501465,
465
+ "learning_rate": 5.935467842613583e-05,
466
+ "loss": 0.28,
467
+ "step": 5900
468
+ },
469
+ {
470
+ "epoch": 0.5421278518183872,
471
+ "grad_norm": 2.769786834716797,
472
+ "learning_rate": 5.913729387286109e-05,
473
+ "loss": 0.2758,
474
+ "step": 6000
475
+ },
476
+ {
477
+ "epoch": 0.5421278518183872,
478
+ "eval_loss": 0.1405603438615799,
479
+ "eval_runtime": 90.2531,
480
+ "eval_samples_per_second": 47.234,
481
+ "eval_steps_per_second": 0.742,
482
+ "eval_wer": 0.14793326957815323,
483
+ "step": 6000
484
+ },
485
+ {
486
+ "epoch": 0.5511633160153603,
487
+ "grad_norm": 2.6292548179626465,
488
+ "learning_rate": 5.891990931958634e-05,
489
+ "loss": 0.2744,
490
+ "step": 6100
491
+ },
492
+ {
493
+ "epoch": 0.5601987802123334,
494
+ "grad_norm": 2.536770820617676,
495
+ "learning_rate": 5.87025247663116e-05,
496
+ "loss": 0.2735,
497
+ "step": 6200
498
+ },
499
+ {
500
+ "epoch": 0.5692342444093065,
501
+ "grad_norm": 2.3336434364318848,
502
+ "learning_rate": 5.848514021303685e-05,
503
+ "loss": 0.2764,
504
+ "step": 6300
505
+ },
506
+ {
507
+ "epoch": 0.5782697086062797,
508
+ "grad_norm": 2.677401542663574,
509
+ "learning_rate": 5.8267755659762116e-05,
510
+ "loss": 0.2761,
511
+ "step": 6400
512
+ },
513
+ {
514
+ "epoch": 0.5873051728032528,
515
+ "grad_norm": 2.634038209915161,
516
+ "learning_rate": 5.805037110648737e-05,
517
+ "loss": 0.2694,
518
+ "step": 6500
519
+ },
520
+ {
521
+ "epoch": 0.5963406370002259,
522
+ "grad_norm": 2.643404245376587,
523
+ "learning_rate": 5.783298655321262e-05,
524
+ "loss": 0.263,
525
+ "step": 6600
526
+ },
527
+ {
528
+ "epoch": 0.605376101197199,
529
+ "grad_norm": 2.2921056747436523,
530
+ "learning_rate": 5.7615601999937885e-05,
531
+ "loss": 0.2738,
532
+ "step": 6700
533
+ },
534
+ {
535
+ "epoch": 0.6144115653941721,
536
+ "grad_norm": 2.398670196533203,
537
+ "learning_rate": 5.739821744666314e-05,
538
+ "loss": 0.2682,
539
+ "step": 6800
540
+ },
541
+ {
542
+ "epoch": 0.6234470295911453,
543
+ "grad_norm": 2.447571277618408,
544
+ "learning_rate": 5.71808328933884e-05,
545
+ "loss": 0.2653,
546
+ "step": 6900
547
+ },
548
+ {
549
+ "epoch": 0.6324824937881184,
550
+ "grad_norm": 2.270413637161255,
551
+ "learning_rate": 5.6963448340113654e-05,
552
+ "loss": 0.2663,
553
+ "step": 7000
554
+ },
555
+ {
556
+ "epoch": 0.6324824937881184,
557
+ "eval_loss": 0.13218513131141663,
558
+ "eval_runtime": 89.0739,
559
+ "eval_samples_per_second": 47.859,
560
+ "eval_steps_per_second": 0.752,
561
+ "eval_wer": 0.13926256508341303,
562
+ "step": 7000
563
+ },
564
+ {
565
+ "epoch": 0.6415179579850915,
566
+ "grad_norm": 2.406534433364868,
567
+ "learning_rate": 5.674606378683892e-05,
568
+ "loss": 0.2701,
569
+ "step": 7100
570
+ },
571
+ {
572
+ "epoch": 0.6505534221820646,
573
+ "grad_norm": 2.3954741954803467,
574
+ "learning_rate": 5.652867923356417e-05,
575
+ "loss": 0.2661,
576
+ "step": 7200
577
+ },
578
+ {
579
+ "epoch": 0.6595888863790377,
580
+ "grad_norm": 2.3920400142669678,
581
+ "learning_rate": 5.631129468028943e-05,
582
+ "loss": 0.2662,
583
+ "step": 7300
584
+ },
585
+ {
586
+ "epoch": 0.6686243505760109,
587
+ "grad_norm": 2.6168298721313477,
588
+ "learning_rate": 5.6096083972547435e-05,
589
+ "loss": 0.259,
590
+ "step": 7400
591
+ },
592
+ {
593
+ "epoch": 0.677659814772984,
594
+ "grad_norm": 2.351517915725708,
595
+ "learning_rate": 5.587869941927269e-05,
596
+ "loss": 0.2531,
597
+ "step": 7500
598
+ },
599
+ {
600
+ "epoch": 0.6866952789699571,
601
+ "grad_norm": 2.4925589561462402,
602
+ "learning_rate": 5.566131486599794e-05,
603
+ "loss": 0.2584,
604
+ "step": 7600
605
+ },
606
+ {
607
+ "epoch": 0.6957307431669302,
608
+ "grad_norm": 2.465437650680542,
609
+ "learning_rate": 5.5443930312723204e-05,
610
+ "loss": 0.2572,
611
+ "step": 7700
612
+ },
613
+ {
614
+ "epoch": 0.7047662073639033,
615
+ "grad_norm": 2.383103370666504,
616
+ "learning_rate": 5.522654575944846e-05,
617
+ "loss": 0.2541,
618
+ "step": 7800
619
+ },
620
+ {
621
+ "epoch": 0.7138016715608765,
622
+ "grad_norm": 2.254746675491333,
623
+ "learning_rate": 5.5009161206173716e-05,
624
+ "loss": 0.2551,
625
+ "step": 7900
626
+ },
627
+ {
628
+ "epoch": 0.7228371357578496,
629
+ "grad_norm": 2.601073980331421,
630
+ "learning_rate": 5.479177665289897e-05,
631
+ "loss": 0.2613,
632
+ "step": 8000
633
+ },
634
+ {
635
+ "epoch": 0.7228371357578496,
636
+ "eval_loss": 0.1282639354467392,
637
+ "eval_runtime": 89.4564,
638
+ "eval_samples_per_second": 47.654,
639
+ "eval_steps_per_second": 0.749,
640
+ "eval_wer": 0.1401763893316332,
641
+ "step": 8000
642
+ },
643
+ {
644
+ "epoch": 0.7318725999548227,
645
+ "grad_norm": 2.6043508052825928,
646
+ "learning_rate": 5.4574392099624236e-05,
647
+ "loss": 0.2527,
648
+ "step": 8100
649
+ },
650
+ {
651
+ "epoch": 0.7409080641517958,
652
+ "grad_norm": 2.4817826747894287,
653
+ "learning_rate": 5.4357007546349486e-05,
654
+ "loss": 0.2531,
655
+ "step": 8200
656
+ },
657
+ {
658
+ "epoch": 0.7499435283487689,
659
+ "grad_norm": 2.2043120861053467,
660
+ "learning_rate": 5.413962299307475e-05,
661
+ "loss": 0.2508,
662
+ "step": 8300
663
+ },
664
+ {
665
+ "epoch": 0.7589789925457421,
666
+ "grad_norm": 2.436621904373169,
667
+ "learning_rate": 5.39222384398e-05,
668
+ "loss": 0.2524,
669
+ "step": 8400
670
+ },
671
+ {
672
+ "epoch": 0.7680144567427152,
673
+ "grad_norm": 2.2948272228240967,
674
+ "learning_rate": 5.3704853886525255e-05,
675
+ "loss": 0.2511,
676
+ "step": 8500
677
+ },
678
+ {
679
+ "epoch": 0.7770499209396883,
680
+ "grad_norm": 2.516068935394287,
681
+ "learning_rate": 5.348746933325052e-05,
682
+ "loss": 0.2503,
683
+ "step": 8600
684
+ },
685
+ {
686
+ "epoch": 0.7860853851366614,
687
+ "grad_norm": 2.286062002182007,
688
+ "learning_rate": 5.327008477997577e-05,
689
+ "loss": 0.249,
690
+ "step": 8700
691
+ },
692
+ {
693
+ "epoch": 0.7951208493336345,
694
+ "grad_norm": 2.2099480628967285,
695
+ "learning_rate": 5.305270022670103e-05,
696
+ "loss": 0.2476,
697
+ "step": 8800
698
+ },
699
+ {
700
+ "epoch": 0.8041563135306077,
701
+ "grad_norm": 2.279094934463501,
702
+ "learning_rate": 5.283531567342629e-05,
703
+ "loss": 0.2477,
704
+ "step": 8900
705
+ },
706
+ {
707
+ "epoch": 0.8131917777275808,
708
+ "grad_norm": 2.5608932971954346,
709
+ "learning_rate": 5.2617931120151544e-05,
710
+ "loss": 0.2491,
711
+ "step": 9000
712
+ },
713
+ {
714
+ "epoch": 0.8131917777275808,
715
+ "eval_loss": 0.12159302085638046,
716
+ "eval_runtime": 88.1859,
717
+ "eval_samples_per_second": 48.341,
718
+ "eval_steps_per_second": 0.76,
719
+ "eval_wer": 0.1319094676442461,
720
+ "step": 9000
721
+ },
722
+ {
723
+ "epoch": 0.8222272419245539,
724
+ "grad_norm": 2.8134467601776123,
725
+ "learning_rate": 5.24005465668768e-05,
726
+ "loss": 0.2393,
727
+ "step": 9100
728
+ },
729
+ {
730
+ "epoch": 0.831262706121527,
731
+ "grad_norm": 2.109177589416504,
732
+ "learning_rate": 5.218316201360206e-05,
733
+ "loss": 0.247,
734
+ "step": 9200
735
+ },
736
+ {
737
+ "epoch": 0.8402981703185001,
738
+ "grad_norm": 2.333599090576172,
739
+ "learning_rate": 5.196577746032731e-05,
740
+ "loss": 0.2396,
741
+ "step": 9300
742
+ },
743
+ {
744
+ "epoch": 0.8493336345154733,
745
+ "grad_norm": 2.263291120529175,
746
+ "learning_rate": 5.174839290705257e-05,
747
+ "loss": 0.2454,
748
+ "step": 9400
749
+ },
750
+ {
751
+ "epoch": 0.8583690987124464,
752
+ "grad_norm": 2.1932239532470703,
753
+ "learning_rate": 5.153100835377783e-05,
754
+ "loss": 0.2441,
755
+ "step": 9500
756
+ },
757
+ {
758
+ "epoch": 0.8674045629094195,
759
+ "grad_norm": 2.3545312881469727,
760
+ "learning_rate": 5.131362380050308e-05,
761
+ "loss": 0.2388,
762
+ "step": 9600
763
+ },
764
+ {
765
+ "epoch": 0.8764400271063926,
766
+ "grad_norm": 1.9302074909210205,
767
+ "learning_rate": 5.1096239247228345e-05,
768
+ "loss": 0.2386,
769
+ "step": 9700
770
+ },
771
+ {
772
+ "epoch": 0.8854754913033657,
773
+ "grad_norm": 2.2227907180786133,
774
+ "learning_rate": 5.0878854693953595e-05,
775
+ "loss": 0.245,
776
+ "step": 9800
777
+ },
778
+ {
779
+ "epoch": 0.8945109555003389,
780
+ "grad_norm": 2.0656354427337646,
781
+ "learning_rate": 5.066147014067886e-05,
782
+ "loss": 0.2341,
783
+ "step": 9900
784
+ },
785
+ {
786
+ "epoch": 0.903546419697312,
787
+ "grad_norm": 2.062394142150879,
788
+ "learning_rate": 5.0444085587404114e-05,
789
+ "loss": 0.238,
790
+ "step": 10000
791
+ },
792
+ {
793
+ "epoch": 0.903546419697312,
794
+ "eval_loss": 0.11923061311244965,
795
+ "eval_runtime": 88.4115,
796
+ "eval_samples_per_second": 48.218,
797
+ "eval_steps_per_second": 0.758,
798
+ "eval_wer": 0.1290829879927744,
799
+ "step": 10000
800
+ },
801
+ {
802
+ "epoch": 0.9125818838942851,
803
+ "grad_norm": 2.264702081680298,
804
+ "learning_rate": 5.022670103412938e-05,
805
+ "loss": 0.2386,
806
+ "step": 10100
807
+ },
808
+ {
809
+ "epoch": 0.9216173480912582,
810
+ "grad_norm": 2.0281338691711426,
811
+ "learning_rate": 5.000931648085463e-05,
812
+ "loss": 0.2374,
813
+ "step": 10200
814
+ },
815
+ {
816
+ "epoch": 0.9306528122882313,
817
+ "grad_norm": 2.0940310955047607,
818
+ "learning_rate": 4.9791931927579883e-05,
819
+ "loss": 0.2349,
820
+ "step": 10300
821
+ },
822
+ {
823
+ "epoch": 0.9396882764852045,
824
+ "grad_norm": 2.1335864067077637,
825
+ "learning_rate": 4.957454737430514e-05,
826
+ "loss": 0.2326,
827
+ "step": 10400
828
+ },
829
+ {
830
+ "epoch": 0.9487237406821776,
831
+ "grad_norm": 2.3644163608551025,
832
+ "learning_rate": 4.9357162821030396e-05,
833
+ "loss": 0.2314,
834
+ "step": 10500
835
+ },
836
+ {
837
+ "epoch": 0.9577592048791507,
838
+ "grad_norm": 2.029175043106079,
839
+ "learning_rate": 4.91419521132884e-05,
840
+ "loss": 0.2363,
841
+ "step": 10600
842
+ },
843
+ {
844
+ "epoch": 0.9667946690761238,
845
+ "grad_norm": 2.630101203918457,
846
+ "learning_rate": 4.8924567560013664e-05,
847
+ "loss": 0.2298,
848
+ "step": 10700
849
+ },
850
+ {
851
+ "epoch": 0.9758301332730969,
852
+ "grad_norm": 2.356724500656128,
853
+ "learning_rate": 4.870718300673891e-05,
854
+ "loss": 0.2269,
855
+ "step": 10800
856
+ },
857
+ {
858
+ "epoch": 0.9848655974700701,
859
+ "grad_norm": 2.1543145179748535,
860
+ "learning_rate": 4.8489798453464176e-05,
861
+ "loss": 0.2377,
862
+ "step": 10900
863
+ },
864
+ {
865
+ "epoch": 0.9939010616670432,
866
+ "grad_norm": 2.399824857711792,
867
+ "learning_rate": 4.827241390018943e-05,
868
+ "loss": 0.2287,
869
+ "step": 11000
870
+ },
871
+ {
872
+ "epoch": 0.9939010616670432,
873
+ "eval_loss": 0.11506820470094681,
874
+ "eval_runtime": 89.5431,
875
+ "eval_samples_per_second": 47.608,
876
+ "eval_steps_per_second": 0.748,
877
+ "eval_wer": 0.1275528636701732,
878
+ "step": 11000
879
+ },
880
+ {
881
+ "epoch": 1.0028913485430313,
882
+ "grad_norm": 2.18354868888855,
883
+ "learning_rate": 4.805502934691468e-05,
884
+ "loss": 0.2129,
885
+ "step": 11100
886
+ },
887
+ {
888
+ "epoch": 1.0119268127400045,
889
+ "grad_norm": 2.018084764480591,
890
+ "learning_rate": 4.7837644793639945e-05,
891
+ "loss": 0.1792,
892
+ "step": 11200
893
+ },
894
+ {
895
+ "epoch": 1.0209622769369777,
896
+ "grad_norm": 2.1397042274475098,
897
+ "learning_rate": 4.76202602403652e-05,
898
+ "loss": 0.1794,
899
+ "step": 11300
900
+ },
901
+ {
902
+ "epoch": 1.0299977411339507,
903
+ "grad_norm": 1.925986886024475,
904
+ "learning_rate": 4.740287568709046e-05,
905
+ "loss": 0.1816,
906
+ "step": 11400
907
+ },
908
+ {
909
+ "epoch": 1.039033205330924,
910
+ "grad_norm": 2.0704362392425537,
911
+ "learning_rate": 4.7185491133815715e-05,
912
+ "loss": 0.1767,
913
+ "step": 11500
914
+ },
915
+ {
916
+ "epoch": 1.048068669527897,
917
+ "grad_norm": 1.8338583707809448,
918
+ "learning_rate": 4.696810658054098e-05,
919
+ "loss": 0.1767,
920
+ "step": 11600
921
+ },
922
+ {
923
+ "epoch": 1.05710413372487,
924
+ "grad_norm": 1.9655053615570068,
925
+ "learning_rate": 4.675072202726623e-05,
926
+ "loss": 0.1814,
927
+ "step": 11700
928
+ },
929
+ {
930
+ "epoch": 1.0661395979218433,
931
+ "grad_norm": 1.880100965499878,
932
+ "learning_rate": 4.653333747399149e-05,
933
+ "loss": 0.1786,
934
+ "step": 11800
935
+ },
936
+ {
937
+ "epoch": 1.0751750621188163,
938
+ "grad_norm": 2.52089524269104,
939
+ "learning_rate": 4.631595292071674e-05,
940
+ "loss": 0.1796,
941
+ "step": 11900
942
+ },
943
+ {
944
+ "epoch": 1.0842105263157895,
945
+ "grad_norm": 2.179574728012085,
946
+ "learning_rate": 4.6098568367441997e-05,
947
+ "loss": 0.1798,
948
+ "step": 12000
949
+ },
950
+ {
951
+ "epoch": 1.0842105263157895,
952
+ "eval_loss": 0.11312589794397354,
953
+ "eval_runtime": 89.9356,
954
+ "eval_samples_per_second": 47.401,
955
+ "eval_steps_per_second": 0.745,
956
+ "eval_wer": 0.12343002868983105,
957
+ "step": 12000
958
+ },
959
+ {
960
+ "epoch": 1.0932459905127625,
961
+ "grad_norm": 2.3577959537506104,
962
+ "learning_rate": 4.588118381416726e-05,
963
+ "loss": 0.1834,
964
+ "step": 12100
965
+ },
966
+ {
967
+ "epoch": 1.1022814547097357,
968
+ "grad_norm": 2.2626988887786865,
969
+ "learning_rate": 4.566379926089251e-05,
970
+ "loss": 0.1792,
971
+ "step": 12200
972
+ },
973
+ {
974
+ "epoch": 1.111316918906709,
975
+ "grad_norm": 2.0373926162719727,
976
+ "learning_rate": 4.544641470761777e-05,
977
+ "loss": 0.1773,
978
+ "step": 12300
979
+ },
980
+ {
981
+ "epoch": 1.120352383103682,
982
+ "grad_norm": 1.8774733543395996,
983
+ "learning_rate": 4.522903015434303e-05,
984
+ "loss": 0.1763,
985
+ "step": 12400
986
+ },
987
+ {
988
+ "epoch": 1.1293878473006551,
989
+ "grad_norm": 2.0867061614990234,
990
+ "learning_rate": 4.5011645601068285e-05,
991
+ "loss": 0.1775,
992
+ "step": 12500
993
+ },
994
+ {
995
+ "epoch": 1.138423311497628,
996
+ "grad_norm": 1.822313904762268,
997
+ "learning_rate": 4.479426104779354e-05,
998
+ "loss": 0.182,
999
+ "step": 12600
1000
+ },
1001
+ {
1002
+ "epoch": 1.1474587756946013,
1003
+ "grad_norm": 1.9483801126480103,
1004
+ "learning_rate": 4.4579050340051546e-05,
1005
+ "loss": 0.1801,
1006
+ "step": 12700
1007
+ },
1008
+ {
1009
+ "epoch": 1.1564942398915745,
1010
+ "grad_norm": 1.7819561958312988,
1011
+ "learning_rate": 4.436166578677681e-05,
1012
+ "loss": 0.175,
1013
+ "step": 12800
1014
+ },
1015
+ {
1016
+ "epoch": 1.1655297040885475,
1017
+ "grad_norm": 2.2512149810791016,
1018
+ "learning_rate": 4.414428123350206e-05,
1019
+ "loss": 0.1771,
1020
+ "step": 12900
1021
+ },
1022
+ {
1023
+ "epoch": 1.1745651682855207,
1024
+ "grad_norm": 2.0755016803741455,
1025
+ "learning_rate": 4.3926896680227315e-05,
1026
+ "loss": 0.1791,
1027
+ "step": 13000
1028
+ },
1029
+ {
1030
+ "epoch": 1.1745651682855207,
1031
+ "eval_loss": 0.1113397553563118,
1032
+ "eval_runtime": 89.8896,
1033
+ "eval_samples_per_second": 47.425,
1034
+ "eval_steps_per_second": 0.745,
1035
+ "eval_wer": 0.11858463500159389,
1036
+ "step": 13000
1037
+ },
1038
+ {
1039
+ "epoch": 1.1836006324824937,
1040
+ "grad_norm": 1.8246344327926636,
1041
+ "learning_rate": 4.370951212695258e-05,
1042
+ "loss": 0.1826,
1043
+ "step": 13100
1044
+ },
1045
+ {
1046
+ "epoch": 1.192636096679467,
1047
+ "grad_norm": 2.0341689586639404,
1048
+ "learning_rate": 4.349212757367783e-05,
1049
+ "loss": 0.1795,
1050
+ "step": 13200
1051
+ },
1052
+ {
1053
+ "epoch": 1.2016715608764401,
1054
+ "grad_norm": 1.8964906930923462,
1055
+ "learning_rate": 4.327474302040309e-05,
1056
+ "loss": 0.1777,
1057
+ "step": 13300
1058
+ },
1059
+ {
1060
+ "epoch": 1.210707025073413,
1061
+ "grad_norm": 1.9983662366867065,
1062
+ "learning_rate": 4.305735846712835e-05,
1063
+ "loss": 0.1777,
1064
+ "step": 13400
1065
+ },
1066
+ {
1067
+ "epoch": 1.2197424892703863,
1068
+ "grad_norm": 1.9901524782180786,
1069
+ "learning_rate": 4.2839973913853604e-05,
1070
+ "loss": 0.1745,
1071
+ "step": 13500
1072
+ },
1073
+ {
1074
+ "epoch": 1.2287779534673593,
1075
+ "grad_norm": 2.0231523513793945,
1076
+ "learning_rate": 4.262258936057886e-05,
1077
+ "loss": 0.183,
1078
+ "step": 13600
1079
+ },
1080
+ {
1081
+ "epoch": 1.2378134176643325,
1082
+ "grad_norm": 2.097205877304077,
1083
+ "learning_rate": 4.240520480730412e-05,
1084
+ "loss": 0.1795,
1085
+ "step": 13700
1086
+ },
1087
+ {
1088
+ "epoch": 1.2468488818613057,
1089
+ "grad_norm": 1.8367393016815186,
1090
+ "learning_rate": 4.218782025402937e-05,
1091
+ "loss": 0.1746,
1092
+ "step": 13800
1093
+ },
1094
+ {
1095
+ "epoch": 1.2558843460582787,
1096
+ "grad_norm": 2.2997806072235107,
1097
+ "learning_rate": 4.197043570075463e-05,
1098
+ "loss": 0.1781,
1099
+ "step": 13900
1100
+ },
1101
+ {
1102
+ "epoch": 1.264919810255252,
1103
+ "grad_norm": 1.9972946643829346,
1104
+ "learning_rate": 4.1753051147479886e-05,
1105
+ "loss": 0.1787,
1106
+ "step": 14000
1107
+ },
1108
+ {
1109
+ "epoch": 1.264919810255252,
1110
+ "eval_loss": 0.10852447897195816,
1111
+ "eval_runtime": 88.2121,
1112
+ "eval_samples_per_second": 48.327,
1113
+ "eval_steps_per_second": 0.76,
1114
+ "eval_wer": 0.11862713845499948,
1115
+ "step": 14000
1116
+ },
1117
+ {
1118
+ "epoch": 1.273955274452225,
1119
+ "grad_norm": 1.9734628200531006,
1120
+ "learning_rate": 4.153566659420514e-05,
1121
+ "loss": 0.178,
1122
+ "step": 14100
1123
+ },
1124
+ {
1125
+ "epoch": 1.282990738649198,
1126
+ "grad_norm": 2.0544159412384033,
1127
+ "learning_rate": 4.1318282040930405e-05,
1128
+ "loss": 0.1704,
1129
+ "step": 14200
1130
+ },
1131
+ {
1132
+ "epoch": 1.2920262028461713,
1133
+ "grad_norm": 1.8968679904937744,
1134
+ "learning_rate": 4.1100897487655655e-05,
1135
+ "loss": 0.1772,
1136
+ "step": 14300
1137
+ },
1138
+ {
1139
+ "epoch": 1.3010616670431443,
1140
+ "grad_norm": 1.8103258609771729,
1141
+ "learning_rate": 4.088351293438092e-05,
1142
+ "loss": 0.179,
1143
+ "step": 14400
1144
+ },
1145
+ {
1146
+ "epoch": 1.3100971312401175,
1147
+ "grad_norm": 1.9365414381027222,
1148
+ "learning_rate": 4.0666128381106174e-05,
1149
+ "loss": 0.1775,
1150
+ "step": 14500
1151
+ },
1152
+ {
1153
+ "epoch": 1.3191325954370905,
1154
+ "grad_norm": 1.9121586084365845,
1155
+ "learning_rate": 4.044874382783143e-05,
1156
+ "loss": 0.1772,
1157
+ "step": 14600
1158
+ },
1159
+ {
1160
+ "epoch": 1.3281680596340637,
1161
+ "grad_norm": 2.0764715671539307,
1162
+ "learning_rate": 4.023135927455669e-05,
1163
+ "loss": 0.1719,
1164
+ "step": 14700
1165
+ },
1166
+ {
1167
+ "epoch": 1.337203523831037,
1168
+ "grad_norm": 1.9687429666519165,
1169
+ "learning_rate": 4.0013974721281944e-05,
1170
+ "loss": 0.1735,
1171
+ "step": 14800
1172
+ },
1173
+ {
1174
+ "epoch": 1.34623898802801,
1175
+ "grad_norm": 2.0690395832061768,
1176
+ "learning_rate": 3.97965901680072e-05,
1177
+ "loss": 0.1797,
1178
+ "step": 14900
1179
+ },
1180
+ {
1181
+ "epoch": 1.355274452224983,
1182
+ "grad_norm": 2.121548891067505,
1183
+ "learning_rate": 3.9579205614732456e-05,
1184
+ "loss": 0.1771,
1185
+ "step": 15000
1186
+ },
1187
+ {
1188
+ "epoch": 1.355274452224983,
1189
+ "eval_loss": 0.10677234828472137,
1190
+ "eval_runtime": 88.6946,
1191
+ "eval_samples_per_second": 48.064,
1192
+ "eval_steps_per_second": 0.755,
1193
+ "eval_wer": 0.11541812772287749,
1194
+ "step": 15000
1195
+ },
1196
+ {
1197
+ "epoch": 1.364309916421956,
1198
+ "grad_norm": 2.3323662281036377,
1199
+ "learning_rate": 3.936182106145772e-05,
1200
+ "loss": 0.173,
1201
+ "step": 15100
1202
+ },
1203
+ {
1204
+ "epoch": 1.3733453806189293,
1205
+ "grad_norm": 2.262308359146118,
1206
+ "learning_rate": 3.914443650818297e-05,
1207
+ "loss": 0.1723,
1208
+ "step": 15200
1209
+ },
1210
+ {
1211
+ "epoch": 1.3823808448159025,
1212
+ "grad_norm": 2.0854151248931885,
1213
+ "learning_rate": 3.892705195490823e-05,
1214
+ "loss": 0.1753,
1215
+ "step": 15300
1216
+ },
1217
+ {
1218
+ "epoch": 1.3914163090128755,
1219
+ "grad_norm": 2.0246262550354004,
1220
+ "learning_rate": 3.870966740163348e-05,
1221
+ "loss": 0.1742,
1222
+ "step": 15400
1223
+ },
1224
+ {
1225
+ "epoch": 1.4004517732098487,
1226
+ "grad_norm": 2.0298593044281006,
1227
+ "learning_rate": 3.8492282848358745e-05,
1228
+ "loss": 0.1727,
1229
+ "step": 15500
1230
+ },
1231
+ {
1232
+ "epoch": 1.4094872374068217,
1233
+ "grad_norm": 1.8497194051742554,
1234
+ "learning_rate": 3.8274898295084e-05,
1235
+ "loss": 0.1738,
1236
+ "step": 15600
1237
+ },
1238
+ {
1239
+ "epoch": 1.418522701603795,
1240
+ "grad_norm": 2.052497386932373,
1241
+ "learning_rate": 3.805751374180925e-05,
1242
+ "loss": 0.1719,
1243
+ "step": 15700
1244
+ },
1245
+ {
1246
+ "epoch": 1.427558165800768,
1247
+ "grad_norm": 1.948426604270935,
1248
+ "learning_rate": 3.7840129188534514e-05,
1249
+ "loss": 0.1692,
1250
+ "step": 15800
1251
+ },
1252
+ {
1253
+ "epoch": 1.436593629997741,
1254
+ "grad_norm": 2.078310012817383,
1255
+ "learning_rate": 3.762274463525977e-05,
1256
+ "loss": 0.1736,
1257
+ "step": 15900
1258
+ },
1259
+ {
1260
+ "epoch": 1.4456290941947143,
1261
+ "grad_norm": 1.8413662910461426,
1262
+ "learning_rate": 3.740536008198503e-05,
1263
+ "loss": 0.1728,
1264
+ "step": 16000
1265
+ },
1266
+ {
1267
+ "epoch": 1.4456290941947143,
1268
+ "eval_loss": 0.10456942021846771,
1269
+ "eval_runtime": 88.873,
1270
+ "eval_samples_per_second": 47.967,
1271
+ "eval_steps_per_second": 0.754,
1272
+ "eval_wer": 0.11354797577303156,
1273
+ "step": 16000
1274
+ },
1275
+ {
1276
+ "epoch": 1.4546645583916873,
1277
+ "grad_norm": 1.894006371498108,
1278
+ "learning_rate": 3.7187975528710283e-05,
1279
+ "loss": 0.1737,
1280
+ "step": 16100
1281
+ },
1282
+ {
1283
+ "epoch": 1.4637000225886605,
1284
+ "grad_norm": 2.0090203285217285,
1285
+ "learning_rate": 3.6970590975435547e-05,
1286
+ "loss": 0.1723,
1287
+ "step": 16200
1288
+ },
1289
+ {
1290
+ "epoch": 1.4727354867856337,
1291
+ "grad_norm": 1.896735668182373,
1292
+ "learning_rate": 3.6753206422160796e-05,
1293
+ "loss": 0.1744,
1294
+ "step": 16300
1295
+ },
1296
+ {
1297
+ "epoch": 1.4817709509826067,
1298
+ "grad_norm": 1.9422425031661987,
1299
+ "learning_rate": 3.653582186888606e-05,
1300
+ "loss": 0.1662,
1301
+ "step": 16400
1302
+ },
1303
+ {
1304
+ "epoch": 1.49080641517958,
1305
+ "grad_norm": 2.205997943878174,
1306
+ "learning_rate": 3.6318437315611316e-05,
1307
+ "loss": 0.1726,
1308
+ "step": 16500
1309
+ },
1310
+ {
1311
+ "epoch": 1.4998418793765529,
1312
+ "grad_norm": 2.2248659133911133,
1313
+ "learning_rate": 3.6101052762336565e-05,
1314
+ "loss": 0.1739,
1315
+ "step": 16600
1316
+ },
1317
+ {
1318
+ "epoch": 1.508877343573526,
1319
+ "grad_norm": 1.9154504537582397,
1320
+ "learning_rate": 3.588366820906183e-05,
1321
+ "loss": 0.1751,
1322
+ "step": 16700
1323
+ },
1324
+ {
1325
+ "epoch": 1.5179128077704993,
1326
+ "grad_norm": 3.7510364055633545,
1327
+ "learning_rate": 3.566845750131983e-05,
1328
+ "loss": 0.1691,
1329
+ "step": 16800
1330
+ },
1331
+ {
1332
+ "epoch": 1.5269482719674723,
1333
+ "grad_norm": 1.9326035976409912,
1334
+ "learning_rate": 3.545107294804509e-05,
1335
+ "loss": 0.1736,
1336
+ "step": 16900
1337
+ },
1338
+ {
1339
+ "epoch": 1.5359837361644455,
1340
+ "grad_norm": 2.1534535884857178,
1341
+ "learning_rate": 3.5233688394770345e-05,
1342
+ "loss": 0.1714,
1343
+ "step": 17000
1344
+ },
1345
+ {
1346
+ "epoch": 1.5359837361644455,
1347
+ "eval_loss": 0.10288450121879578,
1348
+ "eval_runtime": 88.7852,
1349
+ "eval_samples_per_second": 48.015,
1350
+ "eval_steps_per_second": 0.755,
1351
+ "eval_wer": 0.11522686218255233,
1352
+ "step": 17000
1353
+ },
1354
+ {
1355
+ "epoch": 1.5450192003614185,
1356
+ "grad_norm": 2.0503385066986084,
1357
+ "learning_rate": 3.50163038414956e-05,
1358
+ "loss": 0.1697,
1359
+ "step": 17100
1360
+ },
1361
+ {
1362
+ "epoch": 1.5540546645583917,
1363
+ "grad_norm": 2.1852426528930664,
1364
+ "learning_rate": 3.479891928822086e-05,
1365
+ "loss": 0.1687,
1366
+ "step": 17200
1367
+ },
1368
+ {
1369
+ "epoch": 1.563090128755365,
1370
+ "grad_norm": 1.9237619638442993,
1371
+ "learning_rate": 3.4581534734946115e-05,
1372
+ "loss": 0.1699,
1373
+ "step": 17300
1374
+ },
1375
+ {
1376
+ "epoch": 1.572125592952338,
1377
+ "grad_norm": 1.9139324426651,
1378
+ "learning_rate": 3.436415018167137e-05,
1379
+ "loss": 0.1721,
1380
+ "step": 17400
1381
+ },
1382
+ {
1383
+ "epoch": 1.581161057149311,
1384
+ "grad_norm": 1.8762294054031372,
1385
+ "learning_rate": 3.414676562839663e-05,
1386
+ "loss": 0.1682,
1387
+ "step": 17500
1388
+ },
1389
+ {
1390
+ "epoch": 1.590196521346284,
1391
+ "grad_norm": 1.6753225326538086,
1392
+ "learning_rate": 3.392938107512189e-05,
1393
+ "loss": 0.1648,
1394
+ "step": 17600
1395
+ },
1396
+ {
1397
+ "epoch": 1.5992319855432573,
1398
+ "grad_norm": 2.4316673278808594,
1399
+ "learning_rate": 3.371199652184715e-05,
1400
+ "loss": 0.1701,
1401
+ "step": 17700
1402
+ },
1403
+ {
1404
+ "epoch": 1.6082674497402305,
1405
+ "grad_norm": 1.9219187498092651,
1406
+ "learning_rate": 3.34946119685724e-05,
1407
+ "loss": 0.1669,
1408
+ "step": 17800
1409
+ },
1410
+ {
1411
+ "epoch": 1.6173029139372035,
1412
+ "grad_norm": 1.6715503931045532,
1413
+ "learning_rate": 3.327722741529766e-05,
1414
+ "loss": 0.1675,
1415
+ "step": 17900
1416
+ },
1417
+ {
1418
+ "epoch": 1.6263383781341767,
1419
+ "grad_norm": 1.9405934810638428,
1420
+ "learning_rate": 3.3059842862022916e-05,
1421
+ "loss": 0.1706,
1422
+ "step": 18000
1423
+ },
1424
+ {
1425
+ "epoch": 1.6263383781341767,
1426
+ "eval_loss": 0.10067987442016602,
1427
+ "eval_runtime": 89.3754,
1428
+ "eval_samples_per_second": 47.698,
1429
+ "eval_steps_per_second": 0.75,
1430
+ "eval_wer": 0.11174157900329401,
1431
+ "step": 18000
1432
+ },
1433
+ {
1434
+ "epoch": 1.6353738423311497,
1435
+ "grad_norm": 2.1481971740722656,
1436
+ "learning_rate": 3.284245830874817e-05,
1437
+ "loss": 0.1668,
1438
+ "step": 18100
1439
+ },
1440
+ {
1441
+ "epoch": 1.644409306528123,
1442
+ "grad_norm": 2.29831600189209,
1443
+ "learning_rate": 3.262507375547343e-05,
1444
+ "loss": 0.1683,
1445
+ "step": 18200
1446
+ },
1447
+ {
1448
+ "epoch": 1.653444770725096,
1449
+ "grad_norm": 1.698500633239746,
1450
+ "learning_rate": 3.2407689202198685e-05,
1451
+ "loss": 0.1651,
1452
+ "step": 18300
1453
+ },
1454
+ {
1455
+ "epoch": 1.662480234922069,
1456
+ "grad_norm": 2.0010197162628174,
1457
+ "learning_rate": 3.219030464892394e-05,
1458
+ "loss": 0.1647,
1459
+ "step": 18400
1460
+ },
1461
+ {
1462
+ "epoch": 1.671515699119042,
1463
+ "grad_norm": 1.8577830791473389,
1464
+ "learning_rate": 3.19729200956492e-05,
1465
+ "loss": 0.1649,
1466
+ "step": 18500
1467
+ },
1468
+ {
1469
+ "epoch": 1.6805511633160153,
1470
+ "grad_norm": 2.0325686931610107,
1471
+ "learning_rate": 3.175553554237446e-05,
1472
+ "loss": 0.1664,
1473
+ "step": 18600
1474
+ },
1475
+ {
1476
+ "epoch": 1.6895866275129885,
1477
+ "grad_norm": 1.8574236631393433,
1478
+ "learning_rate": 3.153815098909972e-05,
1479
+ "loss": 0.1646,
1480
+ "step": 18700
1481
+ },
1482
+ {
1483
+ "epoch": 1.6986220917099617,
1484
+ "grad_norm": 1.94573175907135,
1485
+ "learning_rate": 3.1320766435824974e-05,
1486
+ "loss": 0.1623,
1487
+ "step": 18800
1488
+ },
1489
+ {
1490
+ "epoch": 1.7076575559069347,
1491
+ "grad_norm": 1.9908078908920288,
1492
+ "learning_rate": 3.1103381882550224e-05,
1493
+ "loss": 0.1632,
1494
+ "step": 18900
1495
+ },
1496
+ {
1497
+ "epoch": 1.7166930201039077,
1498
+ "grad_norm": 1.7018805742263794,
1499
+ "learning_rate": 3.088599732927549e-05,
1500
+ "loss": 0.163,
1501
+ "step": 19000
1502
+ },
1503
+ {
1504
+ "epoch": 1.7166930201039077,
1505
+ "eval_loss": 0.09983944892883301,
1506
+ "eval_runtime": 88.5039,
1507
+ "eval_samples_per_second": 48.167,
1508
+ "eval_steps_per_second": 0.757,
1509
+ "eval_wer": 0.10740622675592391,
1510
+ "step": 19000
1511
+ },
1512
+ {
1513
+ "epoch": 1.7257284843008809,
1514
+ "grad_norm": 1.8709958791732788,
1515
+ "learning_rate": 3.066861277600074e-05,
1516
+ "loss": 0.163,
1517
+ "step": 19100
1518
+ },
1519
+ {
1520
+ "epoch": 1.734763948497854,
1521
+ "grad_norm": 2.1051034927368164,
1522
+ "learning_rate": 3.0451228222726e-05,
1523
+ "loss": 0.1632,
1524
+ "step": 19200
1525
+ },
1526
+ {
1527
+ "epoch": 1.7437994126948273,
1528
+ "grad_norm": 2.1160008907318115,
1529
+ "learning_rate": 3.0233843669451256e-05,
1530
+ "loss": 0.1677,
1531
+ "step": 19300
1532
+ },
1533
+ {
1534
+ "epoch": 1.7528348768918003,
1535
+ "grad_norm": 1.7885472774505615,
1536
+ "learning_rate": 3.0016459116176512e-05,
1537
+ "loss": 0.1628,
1538
+ "step": 19400
1539
+ },
1540
+ {
1541
+ "epoch": 1.7618703410887733,
1542
+ "grad_norm": 1.7749061584472656,
1543
+ "learning_rate": 2.9799074562901772e-05,
1544
+ "loss": 0.1623,
1545
+ "step": 19500
1546
+ },
1547
+ {
1548
+ "epoch": 1.7709058052857465,
1549
+ "grad_norm": 1.933435320854187,
1550
+ "learning_rate": 2.958169000962703e-05,
1551
+ "loss": 0.1639,
1552
+ "step": 19600
1553
+ },
1554
+ {
1555
+ "epoch": 1.7799412694827197,
1556
+ "grad_norm": 1.7979782819747925,
1557
+ "learning_rate": 2.9364305456352285e-05,
1558
+ "loss": 0.1581,
1559
+ "step": 19700
1560
+ },
1561
+ {
1562
+ "epoch": 1.788976733679693,
1563
+ "grad_norm": 1.9905706644058228,
1564
+ "learning_rate": 2.914692090307754e-05,
1565
+ "loss": 0.1623,
1566
+ "step": 19800
1567
+ },
1568
+ {
1569
+ "epoch": 1.7980121978766659,
1570
+ "grad_norm": 2.146162271499634,
1571
+ "learning_rate": 2.8929536349802798e-05,
1572
+ "loss": 0.1632,
1573
+ "step": 19900
1574
+ },
1575
+ {
1576
+ "epoch": 1.8070476620736389,
1577
+ "grad_norm": 1.861401081085205,
1578
+ "learning_rate": 2.8712151796528054e-05,
1579
+ "loss": 0.1613,
1580
+ "step": 20000
1581
+ },
1582
+ {
1583
+ "epoch": 1.8070476620736389,
1584
+ "eval_loss": 0.09824151545763016,
1585
+ "eval_runtime": 87.8053,
1586
+ "eval_samples_per_second": 48.551,
1587
+ "eval_steps_per_second": 0.763,
1588
+ "eval_wer": 0.10753373711614068,
1589
+ "step": 20000
1590
+ },
1591
+ {
1592
+ "epoch": 1.816083126270612,
1593
+ "grad_norm": 1.8411866426467896,
1594
+ "learning_rate": 2.849476724325331e-05,
1595
+ "loss": 0.165,
1596
+ "step": 20100
1597
+ },
1598
+ {
1599
+ "epoch": 1.8251185904675853,
1600
+ "grad_norm": 1.7575931549072266,
1601
+ "learning_rate": 2.827738268997857e-05,
1602
+ "loss": 0.1564,
1603
+ "step": 20200
1604
+ },
1605
+ {
1606
+ "epoch": 1.8341540546645585,
1607
+ "grad_norm": 2.028254985809326,
1608
+ "learning_rate": 2.8059998136703827e-05,
1609
+ "loss": 0.1589,
1610
+ "step": 20300
1611
+ },
1612
+ {
1613
+ "epoch": 1.8431895188615315,
1614
+ "grad_norm": 1.9810631275177002,
1615
+ "learning_rate": 2.7842613583429083e-05,
1616
+ "loss": 0.1586,
1617
+ "step": 20400
1618
+ },
1619
+ {
1620
+ "epoch": 1.8522249830585045,
1621
+ "grad_norm": 1.8610142469406128,
1622
+ "learning_rate": 2.7625229030154343e-05,
1623
+ "loss": 0.1602,
1624
+ "step": 20500
1625
+ },
1626
+ {
1627
+ "epoch": 1.8612604472554777,
1628
+ "grad_norm": 1.9897997379302979,
1629
+ "learning_rate": 2.74078444768796e-05,
1630
+ "loss": 0.1625,
1631
+ "step": 20600
1632
+ },
1633
+ {
1634
+ "epoch": 1.8702959114524509,
1635
+ "grad_norm": 1.7494564056396484,
1636
+ "learning_rate": 2.7190459923604856e-05,
1637
+ "loss": 0.1593,
1638
+ "step": 20700
1639
+ },
1640
+ {
1641
+ "epoch": 1.879331375649424,
1642
+ "grad_norm": 1.9486002922058105,
1643
+ "learning_rate": 2.6975249215862856e-05,
1644
+ "loss": 0.1595,
1645
+ "step": 20800
1646
+ },
1647
+ {
1648
+ "epoch": 1.888366839846397,
1649
+ "grad_norm": 1.950518012046814,
1650
+ "learning_rate": 2.6757864662588116e-05,
1651
+ "loss": 0.1619,
1652
+ "step": 20900
1653
+ },
1654
+ {
1655
+ "epoch": 1.89740230404337,
1656
+ "grad_norm": 1.9625803232192993,
1657
+ "learning_rate": 2.6540480109313373e-05,
1658
+ "loss": 0.1568,
1659
+ "step": 21000
1660
+ },
1661
+ {
1662
+ "epoch": 1.89740230404337,
1663
+ "eval_loss": 0.09674616158008575,
1664
+ "eval_runtime": 88.8971,
1665
+ "eval_samples_per_second": 47.954,
1666
+ "eval_steps_per_second": 0.754,
1667
+ "eval_wer": 0.10868133035809159,
1668
+ "step": 21000
1669
+ },
1670
+ {
1671
+ "epoch": 1.9064377682403433,
1672
+ "grad_norm": 1.7447710037231445,
1673
+ "learning_rate": 2.632309555603863e-05,
1674
+ "loss": 0.1566,
1675
+ "step": 21100
1676
+ },
1677
+ {
1678
+ "epoch": 1.9154732324373165,
1679
+ "grad_norm": 2.0597004890441895,
1680
+ "learning_rate": 2.610571100276389e-05,
1681
+ "loss": 0.1594,
1682
+ "step": 21200
1683
+ },
1684
+ {
1685
+ "epoch": 1.9245086966342897,
1686
+ "grad_norm": 2.045921802520752,
1687
+ "learning_rate": 2.5888326449489145e-05,
1688
+ "loss": 0.1592,
1689
+ "step": 21300
1690
+ },
1691
+ {
1692
+ "epoch": 1.9335441608312627,
1693
+ "grad_norm": 1.9995648860931396,
1694
+ "learning_rate": 2.56709418962144e-05,
1695
+ "loss": 0.1591,
1696
+ "step": 21400
1697
+ },
1698
+ {
1699
+ "epoch": 1.9425796250282357,
1700
+ "grad_norm": 1.765527367591858,
1701
+ "learning_rate": 2.5455731188472406e-05,
1702
+ "loss": 0.1578,
1703
+ "step": 21500
1704
+ },
1705
+ {
1706
+ "epoch": 1.9516150892252089,
1707
+ "grad_norm": 1.8758126497268677,
1708
+ "learning_rate": 2.5238346635197665e-05,
1709
+ "loss": 0.1577,
1710
+ "step": 21600
1711
+ },
1712
+ {
1713
+ "epoch": 1.960650553422182,
1714
+ "grad_norm": 1.770780324935913,
1715
+ "learning_rate": 2.502096208192292e-05,
1716
+ "loss": 0.1584,
1717
+ "step": 21700
1718
+ },
1719
+ {
1720
+ "epoch": 1.9696860176191553,
1721
+ "grad_norm": 1.8630551099777222,
1722
+ "learning_rate": 2.4803577528648175e-05,
1723
+ "loss": 0.1548,
1724
+ "step": 21800
1725
+ },
1726
+ {
1727
+ "epoch": 1.9787214818161283,
1728
+ "grad_norm": 1.8517158031463623,
1729
+ "learning_rate": 2.458619297537343e-05,
1730
+ "loss": 0.1593,
1731
+ "step": 21900
1732
+ },
1733
+ {
1734
+ "epoch": 1.9877569460131013,
1735
+ "grad_norm": 1.6973580121994019,
1736
+ "learning_rate": 2.436880842209869e-05,
1737
+ "loss": 0.1525,
1738
+ "step": 22000
1739
+ },
1740
+ {
1741
+ "epoch": 1.9877569460131013,
1742
+ "eval_loss": 0.0945153757929802,
1743
+ "eval_runtime": 87.5175,
1744
+ "eval_samples_per_second": 48.71,
1745
+ "eval_steps_per_second": 0.766,
1746
+ "eval_wer": 0.10449474019764106,
1747
+ "step": 22000
1748
+ },
1749
+ {
1750
+ "epoch": 1.9967924102100745,
1751
+ "grad_norm": 2.0748767852783203,
1752
+ "learning_rate": 2.4151423868823947e-05,
1753
+ "loss": 0.1573,
1754
+ "step": 22100
1755
+ },
1756
+ {
1757
+ "epoch": 2.0057826970860626,
1758
+ "grad_norm": 1.6151518821716309,
1759
+ "learning_rate": 2.3934039315549204e-05,
1760
+ "loss": 0.1241,
1761
+ "step": 22200
1762
+ },
1763
+ {
1764
+ "epoch": 2.014818161283036,
1765
+ "grad_norm": 1.5904980897903442,
1766
+ "learning_rate": 2.3716654762274464e-05,
1767
+ "loss": 0.1074,
1768
+ "step": 22300
1769
+ },
1770
+ {
1771
+ "epoch": 2.023853625480009,
1772
+ "grad_norm": 1.4857326745986938,
1773
+ "learning_rate": 2.349927020899972e-05,
1774
+ "loss": 0.1029,
1775
+ "step": 22400
1776
+ },
1777
+ {
1778
+ "epoch": 2.0328890896769822,
1779
+ "grad_norm": 1.7787961959838867,
1780
+ "learning_rate": 2.3281885655724976e-05,
1781
+ "loss": 0.1066,
1782
+ "step": 22500
1783
+ },
1784
+ {
1785
+ "epoch": 2.0419245538739554,
1786
+ "grad_norm": 1.6591817140579224,
1787
+ "learning_rate": 2.3066674947982977e-05,
1788
+ "loss": 0.1057,
1789
+ "step": 22600
1790
+ },
1791
+ {
1792
+ "epoch": 2.050960018070928,
1793
+ "grad_norm": 1.6939488649368286,
1794
+ "learning_rate": 2.2849290394708237e-05,
1795
+ "loss": 0.1051,
1796
+ "step": 22700
1797
+ },
1798
+ {
1799
+ "epoch": 2.0599954822679014,
1800
+ "grad_norm": 1.5981281995773315,
1801
+ "learning_rate": 2.2631905841433493e-05,
1802
+ "loss": 0.1036,
1803
+ "step": 22800
1804
+ },
1805
+ {
1806
+ "epoch": 2.0690309464648746,
1807
+ "grad_norm": 1.8668162822723389,
1808
+ "learning_rate": 2.241452128815875e-05,
1809
+ "loss": 0.1063,
1810
+ "step": 22900
1811
+ },
1812
+ {
1813
+ "epoch": 2.078066410661848,
1814
+ "grad_norm": 1.627382755279541,
1815
+ "learning_rate": 2.219713673488401e-05,
1816
+ "loss": 0.1063,
1817
+ "step": 23000
1818
+ },
1819
+ {
1820
+ "epoch": 2.078066410661848,
1821
+ "eval_loss": 0.0966850146651268,
1822
+ "eval_runtime": 88.5935,
1823
+ "eval_samples_per_second": 48.119,
1824
+ "eval_steps_per_second": 0.756,
1825
+ "eval_wer": 0.10462225055785783,
1826
+ "step": 23000
1827
+ },
1828
+ {
1829
+ "epoch": 2.087101874858821,
1830
+ "grad_norm": 1.6317180395126343,
1831
+ "learning_rate": 2.1979752181609266e-05,
1832
+ "loss": 0.1067,
1833
+ "step": 23100
1834
+ },
1835
+ {
1836
+ "epoch": 2.096137339055794,
1837
+ "grad_norm": 1.5637694597244263,
1838
+ "learning_rate": 2.1762367628334522e-05,
1839
+ "loss": 0.1061,
1840
+ "step": 23200
1841
+ },
1842
+ {
1843
+ "epoch": 2.105172803252767,
1844
+ "grad_norm": 1.561661720275879,
1845
+ "learning_rate": 2.154498307505978e-05,
1846
+ "loss": 0.1066,
1847
+ "step": 23300
1848
+ },
1849
+ {
1850
+ "epoch": 2.11420826744974,
1851
+ "grad_norm": 1.570977807044983,
1852
+ "learning_rate": 2.132759852178504e-05,
1853
+ "loss": 0.1057,
1854
+ "step": 23400
1855
+ },
1856
+ {
1857
+ "epoch": 2.1232437316467134,
1858
+ "grad_norm": 1.6354864835739136,
1859
+ "learning_rate": 2.111021396851029e-05,
1860
+ "loss": 0.1061,
1861
+ "step": 23500
1862
+ },
1863
+ {
1864
+ "epoch": 2.1322791958436866,
1865
+ "grad_norm": 1.6001309156417847,
1866
+ "learning_rate": 2.0892829415235548e-05,
1867
+ "loss": 0.1038,
1868
+ "step": 23600
1869
+ },
1870
+ {
1871
+ "epoch": 2.1413146600406594,
1872
+ "grad_norm": 1.7492948770523071,
1873
+ "learning_rate": 2.0675444861960808e-05,
1874
+ "loss": 0.1051,
1875
+ "step": 23700
1876
+ },
1877
+ {
1878
+ "epoch": 2.1503501242376326,
1879
+ "grad_norm": 1.7432228326797485,
1880
+ "learning_rate": 2.0458060308686064e-05,
1881
+ "loss": 0.1029,
1882
+ "step": 23800
1883
+ },
1884
+ {
1885
+ "epoch": 2.159385588434606,
1886
+ "grad_norm": 1.5974751710891724,
1887
+ "learning_rate": 2.024067575541132e-05,
1888
+ "loss": 0.1061,
1889
+ "step": 23900
1890
+ },
1891
+ {
1892
+ "epoch": 2.168421052631579,
1893
+ "grad_norm": 1.8045574426651,
1894
+ "learning_rate": 2.0023291202136577e-05,
1895
+ "loss": 0.1075,
1896
+ "step": 24000
1897
+ },
1898
+ {
1899
+ "epoch": 2.168421052631579,
1900
+ "eval_loss": 0.0951407328248024,
1901
+ "eval_runtime": 88.9045,
1902
+ "eval_samples_per_second": 47.95,
1903
+ "eval_steps_per_second": 0.754,
1904
+ "eval_wer": 0.10304962278185102,
1905
+ "step": 24000
1906
+ },
1907
+ {
1908
+ "epoch": 2.1774565168285522,
1909
+ "grad_norm": 1.6032062768936157,
1910
+ "learning_rate": 1.9805906648861836e-05,
1911
+ "loss": 0.1065,
1912
+ "step": 24100
1913
+ },
1914
+ {
1915
+ "epoch": 2.186491981025525,
1916
+ "grad_norm": 1.5442743301391602,
1917
+ "learning_rate": 1.9588522095587093e-05,
1918
+ "loss": 0.1063,
1919
+ "step": 24200
1920
+ },
1921
+ {
1922
+ "epoch": 2.195527445222498,
1923
+ "grad_norm": 1.6346817016601562,
1924
+ "learning_rate": 1.937113754231235e-05,
1925
+ "loss": 0.1036,
1926
+ "step": 24300
1927
+ },
1928
+ {
1929
+ "epoch": 2.2045629094194714,
1930
+ "grad_norm": 1.6535338163375854,
1931
+ "learning_rate": 1.9153752989037606e-05,
1932
+ "loss": 0.1051,
1933
+ "step": 24400
1934
+ },
1935
+ {
1936
+ "epoch": 2.2135983736164446,
1937
+ "grad_norm": 1.6055641174316406,
1938
+ "learning_rate": 1.8936368435762862e-05,
1939
+ "loss": 0.1064,
1940
+ "step": 24500
1941
+ },
1942
+ {
1943
+ "epoch": 2.222633837813418,
1944
+ "grad_norm": 1.936577558517456,
1945
+ "learning_rate": 1.871898388248812e-05,
1946
+ "loss": 0.1045,
1947
+ "step": 24600
1948
+ },
1949
+ {
1950
+ "epoch": 2.2316693020103906,
1951
+ "grad_norm": 1.58518385887146,
1952
+ "learning_rate": 1.8501599329213375e-05,
1953
+ "loss": 0.1071,
1954
+ "step": 24700
1955
+ },
1956
+ {
1957
+ "epoch": 2.240704766207364,
1958
+ "grad_norm": 1.73505437374115,
1959
+ "learning_rate": 1.8284214775938635e-05,
1960
+ "loss": 0.1065,
1961
+ "step": 24800
1962
+ },
1963
+ {
1964
+ "epoch": 2.249740230404337,
1965
+ "grad_norm": 1.7908620834350586,
1966
+ "learning_rate": 1.806683022266389e-05,
1967
+ "loss": 0.1065,
1968
+ "step": 24900
1969
+ },
1970
+ {
1971
+ "epoch": 2.2587756946013102,
1972
+ "grad_norm": 1.654637336730957,
1973
+ "learning_rate": 1.7849445669389147e-05,
1974
+ "loss": 0.1035,
1975
+ "step": 25000
1976
+ },
1977
+ {
1978
+ "epoch": 2.2587756946013102,
1979
+ "eval_loss": 0.09359237551689148,
1980
+ "eval_runtime": 90.6143,
1981
+ "eval_samples_per_second": 47.046,
1982
+ "eval_steps_per_second": 0.739,
1983
+ "eval_wer": 0.10149824673254702,
1984
+ "step": 25000
1985
+ },
1986
+ {
1987
+ "epoch": 2.2678111587982834,
1988
+ "grad_norm": 1.6015100479125977,
1989
+ "learning_rate": 1.7632061116114407e-05,
1990
+ "loss": 0.1062,
1991
+ "step": 25100
1992
+ },
1993
+ {
1994
+ "epoch": 2.276846622995256,
1995
+ "grad_norm": 1.6547913551330566,
1996
+ "learning_rate": 1.741467656283966e-05,
1997
+ "loss": 0.1053,
1998
+ "step": 25200
1999
+ },
2000
+ {
2001
+ "epoch": 2.2858820871922294,
2002
+ "grad_norm": 1.7010306119918823,
2003
+ "learning_rate": 1.719729200956492e-05,
2004
+ "loss": 0.1041,
2005
+ "step": 25300
2006
+ },
2007
+ {
2008
+ "epoch": 2.2949175513892026,
2009
+ "grad_norm": 1.8139252662658691,
2010
+ "learning_rate": 1.6979907456290176e-05,
2011
+ "loss": 0.103,
2012
+ "step": 25400
2013
+ },
2014
+ {
2015
+ "epoch": 2.303953015586176,
2016
+ "grad_norm": 1.6318985223770142,
2017
+ "learning_rate": 1.6762522903015433e-05,
2018
+ "loss": 0.104,
2019
+ "step": 25500
2020
+ },
2021
+ {
2022
+ "epoch": 2.312988479783149,
2023
+ "grad_norm": 1.798727035522461,
2024
+ "learning_rate": 1.654513834974069e-05,
2025
+ "loss": 0.1055,
2026
+ "step": 25600
2027
+ },
2028
+ {
2029
+ "epoch": 2.322023943980122,
2030
+ "grad_norm": 1.527917504310608,
2031
+ "learning_rate": 1.6327753796465945e-05,
2032
+ "loss": 0.106,
2033
+ "step": 25700
2034
+ },
2035
+ {
2036
+ "epoch": 2.331059408177095,
2037
+ "grad_norm": 1.6333855390548706,
2038
+ "learning_rate": 1.6110369243191205e-05,
2039
+ "loss": 0.1024,
2040
+ "step": 25800
2041
+ },
2042
+ {
2043
+ "epoch": 2.340094872374068,
2044
+ "grad_norm": 1.5563682317733765,
2045
+ "learning_rate": 1.589298468991646e-05,
2046
+ "loss": 0.1031,
2047
+ "step": 25900
2048
+ },
2049
+ {
2050
+ "epoch": 2.3491303365710414,
2051
+ "grad_norm": 1.6106479167938232,
2052
+ "learning_rate": 1.5675600136641718e-05,
2053
+ "loss": 0.1056,
2054
+ "step": 26000
2055
+ },
2056
+ {
2057
+ "epoch": 2.3491303365710414,
2058
+ "eval_loss": 0.09276529401540756,
2059
+ "eval_runtime": 88.7242,
2060
+ "eval_samples_per_second": 48.048,
2061
+ "eval_steps_per_second": 0.755,
2062
+ "eval_wer": 0.10132823291892466,
2063
+ "step": 26000
2064
+ },
2065
+ {
2066
+ "epoch": 2.3581658007680146,
2067
+ "grad_norm": 1.8455883264541626,
2068
+ "learning_rate": 1.5458215583366974e-05,
2069
+ "loss": 0.1043,
2070
+ "step": 26100
2071
+ },
2072
+ {
2073
+ "epoch": 2.3672012649649874,
2074
+ "grad_norm": 1.7726097106933594,
2075
+ "learning_rate": 1.5240831030092233e-05,
2076
+ "loss": 0.1015,
2077
+ "step": 26200
2078
+ },
2079
+ {
2080
+ "epoch": 2.3762367291619606,
2081
+ "grad_norm": 1.6910566091537476,
2082
+ "learning_rate": 1.5023446476817489e-05,
2083
+ "loss": 0.1055,
2084
+ "step": 26300
2085
+ },
2086
+ {
2087
+ "epoch": 2.385272193358934,
2088
+ "grad_norm": 1.642712116241455,
2089
+ "learning_rate": 1.4806061923542747e-05,
2090
+ "loss": 0.1027,
2091
+ "step": 26400
2092
+ },
2093
+ {
2094
+ "epoch": 2.394307657555907,
2095
+ "grad_norm": 1.6066936254501343,
2096
+ "learning_rate": 1.4588677370268002e-05,
2097
+ "loss": 0.1052,
2098
+ "step": 26500
2099
+ },
2100
+ {
2101
+ "epoch": 2.4033431217528802,
2102
+ "grad_norm": 1.7851406335830688,
2103
+ "learning_rate": 1.437129281699326e-05,
2104
+ "loss": 0.1029,
2105
+ "step": 26600
2106
+ },
2107
+ {
2108
+ "epoch": 2.412378585949853,
2109
+ "grad_norm": 1.9918655157089233,
2110
+ "learning_rate": 1.4153908263718516e-05,
2111
+ "loss": 0.1006,
2112
+ "step": 26700
2113
+ },
2114
+ {
2115
+ "epoch": 2.421414050146826,
2116
+ "grad_norm": 1.6415534019470215,
2117
+ "learning_rate": 1.3936523710443774e-05,
2118
+ "loss": 0.1038,
2119
+ "step": 26800
2120
+ },
2121
+ {
2122
+ "epoch": 2.4304495143437994,
2123
+ "grad_norm": 1.9253250360488892,
2124
+ "learning_rate": 1.3719139157169032e-05,
2125
+ "loss": 0.1024,
2126
+ "step": 26900
2127
+ },
2128
+ {
2129
+ "epoch": 2.4394849785407726,
2130
+ "grad_norm": 1.86326265335083,
2131
+ "learning_rate": 1.3501754603894287e-05,
2132
+ "loss": 0.1019,
2133
+ "step": 27000
2134
+ },
2135
+ {
2136
+ "epoch": 2.4394849785407726,
2137
+ "eval_loss": 0.09212099760770798,
2138
+ "eval_runtime": 88.0292,
2139
+ "eval_samples_per_second": 48.427,
2140
+ "eval_steps_per_second": 0.761,
2141
+ "eval_wer": 0.1000106258633514,
2142
+ "step": 27000
2143
+ },
2144
+ {
2145
+ "epoch": 2.448520442737746,
2146
+ "grad_norm": 1.7671024799346924,
2147
+ "learning_rate": 1.3284370050619545e-05,
2148
+ "loss": 0.1026,
2149
+ "step": 27100
2150
+ },
2151
+ {
2152
+ "epoch": 2.4575559069347186,
2153
+ "grad_norm": 1.7686715126037598,
2154
+ "learning_rate": 1.3066985497344802e-05,
2155
+ "loss": 0.1041,
2156
+ "step": 27200
2157
+ },
2158
+ {
2159
+ "epoch": 2.466591371131692,
2160
+ "grad_norm": 1.743655800819397,
2161
+ "learning_rate": 1.284960094407006e-05,
2162
+ "loss": 0.099,
2163
+ "step": 27300
2164
+ },
2165
+ {
2166
+ "epoch": 2.475626835328665,
2167
+ "grad_norm": 1.7912476062774658,
2168
+ "learning_rate": 1.2632216390795314e-05,
2169
+ "loss": 0.1034,
2170
+ "step": 27400
2171
+ },
2172
+ {
2173
+ "epoch": 2.484662299525638,
2174
+ "grad_norm": 1.5481427907943726,
2175
+ "learning_rate": 1.2414831837520572e-05,
2176
+ "loss": 0.1037,
2177
+ "step": 27500
2178
+ },
2179
+ {
2180
+ "epoch": 2.4936977637226114,
2181
+ "grad_norm": 1.5013809204101562,
2182
+ "learning_rate": 1.219744728424583e-05,
2183
+ "loss": 0.1028,
2184
+ "step": 27600
2185
+ },
2186
+ {
2187
+ "epoch": 2.5027332279195846,
2188
+ "grad_norm": 1.592502236366272,
2189
+ "learning_rate": 1.1980062730971087e-05,
2190
+ "loss": 0.1024,
2191
+ "step": 27700
2192
+ },
2193
+ {
2194
+ "epoch": 2.5117686921165574,
2195
+ "grad_norm": 1.6279585361480713,
2196
+ "learning_rate": 1.1762678177696345e-05,
2197
+ "loss": 0.1017,
2198
+ "step": 27800
2199
+ },
2200
+ {
2201
+ "epoch": 2.5208041563135306,
2202
+ "grad_norm": 1.718693733215332,
2203
+ "learning_rate": 1.15452936244216e-05,
2204
+ "loss": 0.0991,
2205
+ "step": 27900
2206
+ },
2207
+ {
2208
+ "epoch": 2.529839620510504,
2209
+ "grad_norm": 1.721211314201355,
2210
+ "learning_rate": 1.1327909071146858e-05,
2211
+ "loss": 0.1004,
2212
+ "step": 28000
2213
+ },
2214
+ {
2215
+ "epoch": 2.529839620510504,
2216
+ "eval_loss": 0.0911058560013771,
2217
+ "eval_runtime": 86.9208,
2218
+ "eval_samples_per_second": 49.045,
2219
+ "eval_steps_per_second": 0.771,
2220
+ "eval_wer": 0.09856550844756136,
2221
+ "step": 28000
2222
+ },
2223
+ {
2224
+ "epoch": 2.5388750847074766,
2225
+ "grad_norm": 1.708903193473816,
2226
+ "learning_rate": 1.1110524517872116e-05,
2227
+ "loss": 0.1032,
2228
+ "step": 28100
2229
+ },
2230
+ {
2231
+ "epoch": 2.54791054890445,
2232
+ "grad_norm": 1.6191095113754272,
2233
+ "learning_rate": 1.0893139964597372e-05,
2234
+ "loss": 0.1031,
2235
+ "step": 28200
2236
+ },
2237
+ {
2238
+ "epoch": 2.556946013101423,
2239
+ "grad_norm": 1.5952250957489014,
2240
+ "learning_rate": 1.0677929256855375e-05,
2241
+ "loss": 0.0991,
2242
+ "step": 28300
2243
+ },
2244
+ {
2245
+ "epoch": 2.565981477298396,
2246
+ "grad_norm": 1.8054704666137695,
2247
+ "learning_rate": 1.0460544703580633e-05,
2248
+ "loss": 0.0994,
2249
+ "step": 28400
2250
+ },
2251
+ {
2252
+ "epoch": 2.5750169414953694,
2253
+ "grad_norm": 1.4976806640625,
2254
+ "learning_rate": 1.024316015030589e-05,
2255
+ "loss": 0.0988,
2256
+ "step": 28500
2257
+ },
2258
+ {
2259
+ "epoch": 2.5840524056923426,
2260
+ "grad_norm": 1.6461458206176758,
2261
+ "learning_rate": 1.0025775597031147e-05,
2262
+ "loss": 0.0989,
2263
+ "step": 28600
2264
+ },
2265
+ {
2266
+ "epoch": 2.593087869889316,
2267
+ "grad_norm": 1.631536841392517,
2268
+ "learning_rate": 9.808391043756405e-06,
2269
+ "loss": 0.1001,
2270
+ "step": 28700
2271
+ },
2272
+ {
2273
+ "epoch": 2.6021233340862886,
2274
+ "grad_norm": 1.8152861595153809,
2275
+ "learning_rate": 9.59100649048166e-06,
2276
+ "loss": 0.1001,
2277
+ "step": 28800
2278
+ },
2279
+ {
2280
+ "epoch": 2.611158798283262,
2281
+ "grad_norm": 1.4996885061264038,
2282
+ "learning_rate": 9.373621937206918e-06,
2283
+ "loss": 0.103,
2284
+ "step": 28900
2285
+ },
2286
+ {
2287
+ "epoch": 2.620194262480235,
2288
+ "grad_norm": 1.8811280727386475,
2289
+ "learning_rate": 9.156237383932176e-06,
2290
+ "loss": 0.0992,
2291
+ "step": 29000
2292
+ },
2293
+ {
2294
+ "epoch": 2.620194262480235,
2295
+ "eval_loss": 0.09040974825620651,
2296
+ "eval_runtime": 87.3549,
2297
+ "eval_samples_per_second": 48.801,
2298
+ "eval_steps_per_second": 0.767,
2299
+ "eval_wer": 0.0979917118265859,
2300
+ "step": 29000
2301
+ },
2302
+ {
2303
+ "epoch": 2.629229726677208,
2304
+ "grad_norm": 1.550436019897461,
2305
+ "learning_rate": 8.938852830657433e-06,
2306
+ "loss": 0.0997,
2307
+ "step": 29100
2308
+ },
2309
+ {
2310
+ "epoch": 2.638265190874181,
2311
+ "grad_norm": 1.7116386890411377,
2312
+ "learning_rate": 8.721468277382689e-06,
2313
+ "loss": 0.1021,
2314
+ "step": 29200
2315
+ },
2316
+ {
2317
+ "epoch": 2.647300655071154,
2318
+ "grad_norm": 1.8250106573104858,
2319
+ "learning_rate": 8.504083724107947e-06,
2320
+ "loss": 0.0992,
2321
+ "step": 29300
2322
+ },
2323
+ {
2324
+ "epoch": 2.6563361192681274,
2325
+ "grad_norm": 1.704163670539856,
2326
+ "learning_rate": 8.286699170833203e-06,
2327
+ "loss": 0.0974,
2328
+ "step": 29400
2329
+ },
2330
+ {
2331
+ "epoch": 2.6653715834651006,
2332
+ "grad_norm": 1.7405962944030762,
2333
+ "learning_rate": 8.06931461755846e-06,
2334
+ "loss": 0.0997,
2335
+ "step": 29500
2336
+ },
2337
+ {
2338
+ "epoch": 2.674407047662074,
2339
+ "grad_norm": 1.599592685699463,
2340
+ "learning_rate": 7.851930064283716e-06,
2341
+ "loss": 0.0978,
2342
+ "step": 29600
2343
+ },
2344
+ {
2345
+ "epoch": 2.683442511859047,
2346
+ "grad_norm": 1.666237711906433,
2347
+ "learning_rate": 7.634545511008974e-06,
2348
+ "loss": 0.0986,
2349
+ "step": 29700
2350
+ },
2351
+ {
2352
+ "epoch": 2.69247797605602,
2353
+ "grad_norm": 1.6730016469955444,
2354
+ "learning_rate": 7.417160957734231e-06,
2355
+ "loss": 0.0958,
2356
+ "step": 29800
2357
+ },
2358
+ {
2359
+ "epoch": 2.701513440252993,
2360
+ "grad_norm": 1.800661325454712,
2361
+ "learning_rate": 7.199776404459488e-06,
2362
+ "loss": 0.0967,
2363
+ "step": 29900
2364
+ },
2365
+ {
2366
+ "epoch": 2.710548904449966,
2367
+ "grad_norm": 1.4267141819000244,
2368
+ "learning_rate": 6.982391851184745e-06,
2369
+ "loss": 0.1011,
2370
+ "step": 30000
2371
+ },
2372
+ {
2373
+ "epoch": 2.710548904449966,
2374
+ "eval_loss": 0.08978110551834106,
2375
+ "eval_runtime": 92.3004,
2376
+ "eval_samples_per_second": 46.186,
2377
+ "eval_steps_per_second": 0.726,
2378
+ "eval_wer": 0.09784294973966635,
2379
+ "step": 30000
2380
+ },
2381
+ {
2382
+ "epoch": 2.719584368646939,
2383
+ "grad_norm": 1.7578014135360718,
2384
+ "learning_rate": 6.765007297910002e-06,
2385
+ "loss": 0.0988,
2386
+ "step": 30100
2387
+ },
2388
+ {
2389
+ "epoch": 2.728619832843912,
2390
+ "grad_norm": 1.747879981994629,
2391
+ "learning_rate": 6.54762274463526e-06,
2392
+ "loss": 0.0982,
2393
+ "step": 30200
2394
+ },
2395
+ {
2396
+ "epoch": 2.7376552970408854,
2397
+ "grad_norm": 1.4880852699279785,
2398
+ "learning_rate": 6.330238191360516e-06,
2399
+ "loss": 0.0944,
2400
+ "step": 30300
2401
+ },
2402
+ {
2403
+ "epoch": 2.7466907612378586,
2404
+ "grad_norm": 1.6102066040039062,
2405
+ "learning_rate": 6.112853638085773e-06,
2406
+ "loss": 0.099,
2407
+ "step": 30400
2408
+ },
2409
+ {
2410
+ "epoch": 2.755726225434832,
2411
+ "grad_norm": 2.1802284717559814,
2412
+ "learning_rate": 5.89546908481103e-06,
2413
+ "loss": 0.0963,
2414
+ "step": 30500
2415
+ },
2416
+ {
2417
+ "epoch": 2.764761689631805,
2418
+ "grad_norm": 1.65652334690094,
2419
+ "learning_rate": 5.680258377069035e-06,
2420
+ "loss": 0.099,
2421
+ "step": 30600
2422
+ },
2423
+ {
2424
+ "epoch": 2.7737971538287782,
2425
+ "grad_norm": 1.344401240348816,
2426
+ "learning_rate": 5.462873823794291e-06,
2427
+ "loss": 0.0979,
2428
+ "step": 30700
2429
+ },
2430
+ {
2431
+ "epoch": 2.782832618025751,
2432
+ "grad_norm": 1.6446696519851685,
2433
+ "learning_rate": 5.245489270519548e-06,
2434
+ "loss": 0.0944,
2435
+ "step": 30800
2436
+ },
2437
+ {
2438
+ "epoch": 2.791868082222724,
2439
+ "grad_norm": 1.529815435409546,
2440
+ "learning_rate": 5.028104717244806e-06,
2441
+ "loss": 0.0967,
2442
+ "step": 30900
2443
+ },
2444
+ {
2445
+ "epoch": 2.8009035464196974,
2446
+ "grad_norm": 1.7729915380477905,
2447
+ "learning_rate": 4.810720163970063e-06,
2448
+ "loss": 0.095,
2449
+ "step": 31000
2450
+ },
2451
+ {
2452
+ "epoch": 2.8009035464196974,
2453
+ "eval_loss": 0.08919844031333923,
2454
+ "eval_runtime": 90.4055,
2455
+ "eval_samples_per_second": 47.154,
2456
+ "eval_steps_per_second": 0.741,
2457
+ "eval_wer": 0.09748167038571884,
2458
+ "step": 31000
2459
+ },
2460
+ {
2461
+ "epoch": 2.80993901061667,
2462
+ "grad_norm": 1.6226630210876465,
2463
+ "learning_rate": 4.59333561069532e-06,
2464
+ "loss": 0.0982,
2465
+ "step": 31100
2466
+ },
2467
+ {
2468
+ "epoch": 2.8189744748136434,
2469
+ "grad_norm": 1.5628806352615356,
2470
+ "learning_rate": 4.375951057420576e-06,
2471
+ "loss": 0.095,
2472
+ "step": 31200
2473
+ },
2474
+ {
2475
+ "epoch": 2.8280099390106166,
2476
+ "grad_norm": 1.5284922122955322,
2477
+ "learning_rate": 4.158566504145834e-06,
2478
+ "loss": 0.0945,
2479
+ "step": 31300
2480
+ },
2481
+ {
2482
+ "epoch": 2.83704540320759,
2483
+ "grad_norm": 1.9399908781051636,
2484
+ "learning_rate": 3.941181950871091e-06,
2485
+ "loss": 0.0954,
2486
+ "step": 31400
2487
+ },
2488
+ {
2489
+ "epoch": 2.846080867404563,
2490
+ "grad_norm": 1.7431321144104004,
2491
+ "learning_rate": 3.7237973975963476e-06,
2492
+ "loss": 0.0973,
2493
+ "step": 31500
2494
+ },
2495
+ {
2496
+ "epoch": 2.855116331601536,
2497
+ "grad_norm": 1.4165501594543457,
2498
+ "learning_rate": 3.5064128443216044e-06,
2499
+ "loss": 0.0954,
2500
+ "step": 31600
2501
+ },
2502
+ {
2503
+ "epoch": 2.8641517957985094,
2504
+ "grad_norm": 1.8231940269470215,
2505
+ "learning_rate": 3.2890282910468617e-06,
2506
+ "loss": 0.0969,
2507
+ "step": 31700
2508
+ },
2509
+ {
2510
+ "epoch": 2.873187259995482,
2511
+ "grad_norm": 1.9092686176300049,
2512
+ "learning_rate": 3.0716437377721185e-06,
2513
+ "loss": 0.0967,
2514
+ "step": 31800
2515
+ },
2516
+ {
2517
+ "epoch": 2.8822227241924554,
2518
+ "grad_norm": 1.6101560592651367,
2519
+ "learning_rate": 2.8542591844973753e-06,
2520
+ "loss": 0.0973,
2521
+ "step": 31900
2522
+ },
2523
+ {
2524
+ "epoch": 2.8912581883894286,
2525
+ "grad_norm": 1.6077231168746948,
2526
+ "learning_rate": 2.636874631222633e-06,
2527
+ "loss": 0.0975,
2528
+ "step": 32000
2529
+ },
2530
+ {
2531
+ "epoch": 2.8912581883894286,
2532
+ "eval_loss": 0.08852633088827133,
2533
+ "eval_runtime": 88.8694,
2534
+ "eval_samples_per_second": 47.969,
2535
+ "eval_steps_per_second": 0.754,
2536
+ "eval_wer": 0.096015301243226,
2537
+ "step": 32000
2538
+ },
2539
+ {
2540
+ "epoch": 2.9002936525864014,
2541
+ "grad_norm": 1.6472060680389404,
2542
+ "learning_rate": 2.4194900779478898e-06,
2543
+ "loss": 0.0953,
2544
+ "step": 32100
2545
+ },
2546
+ {
2547
+ "epoch": 2.9093291167833746,
2548
+ "grad_norm": 1.5193005800247192,
2549
+ "learning_rate": 2.2021055246731466e-06,
2550
+ "loss": 0.0947,
2551
+ "step": 32200
2552
+ },
2553
+ {
2554
+ "epoch": 2.918364580980348,
2555
+ "grad_norm": 1.3484536409378052,
2556
+ "learning_rate": 1.984720971398404e-06,
2557
+ "loss": 0.0937,
2558
+ "step": 32300
2559
+ },
2560
+ {
2561
+ "epoch": 2.927400045177321,
2562
+ "grad_norm": 1.6725506782531738,
2563
+ "learning_rate": 1.7673364181236606e-06,
2564
+ "loss": 0.0949,
2565
+ "step": 32400
2566
+ },
2567
+ {
2568
+ "epoch": 2.936435509374294,
2569
+ "grad_norm": 1.5670363903045654,
2570
+ "learning_rate": 1.5499518648489175e-06,
2571
+ "loss": 0.0928,
2572
+ "step": 32500
2573
+ },
2574
+ {
2575
+ "epoch": 2.9454709735712674,
2576
+ "grad_norm": 1.5655218362808228,
2577
+ "learning_rate": 1.3325673115741747e-06,
2578
+ "loss": 0.0923,
2579
+ "step": 32600
2580
+ },
2581
+ {
2582
+ "epoch": 2.95450643776824,
2583
+ "grad_norm": 1.7287861108779907,
2584
+ "learning_rate": 1.1151827582994317e-06,
2585
+ "loss": 0.0945,
2586
+ "step": 32700
2587
+ },
2588
+ {
2589
+ "epoch": 2.9635419019652134,
2590
+ "grad_norm": 1.5101486444473267,
2591
+ "learning_rate": 8.977982050246885e-07,
2592
+ "loss": 0.0938,
2593
+ "step": 32800
2594
+ },
2595
+ {
2596
+ "epoch": 2.9725773661621866,
2597
+ "grad_norm": 1.4109468460083008,
2598
+ "learning_rate": 6.804136517499457e-07,
2599
+ "loss": 0.091,
2600
+ "step": 32900
2601
+ },
2602
+ {
2603
+ "epoch": 2.98161283035916,
2604
+ "grad_norm": 1.537053108215332,
2605
+ "learning_rate": 4.6302909847520263e-07,
2606
+ "loss": 0.0963,
2607
+ "step": 33000
2608
+ },
2609
+ {
2610
+ "epoch": 2.98161283035916,
2611
+ "eval_loss": 0.08801376074552536,
2612
+ "eval_runtime": 90.4227,
2613
+ "eval_samples_per_second": 47.145,
2614
+ "eval_steps_per_second": 0.741,
2615
+ "eval_wer": 0.09624907023695675,
2616
+ "step": 33000
2617
+ },
2618
+ {
2619
+ "epoch": 2.9906482945561326,
2620
+ "grad_norm": 1.575260043144226,
2621
+ "learning_rate": 2.47818390733207e-07,
2622
+ "loss": 0.0934,
2623
+ "step": 33100
2624
+ },
2625
+ {
2626
+ "epoch": 2.9996837587531058,
2627
+ "grad_norm": 1.5032224655151367,
2628
+ "learning_rate": 3.043383745846402e-08,
2629
+ "loss": 0.0941,
2630
+ "step": 33200
2631
+ },
2632
+ {
2633
+ "epoch": 2.999774113395076,
2634
+ "step": 33201,
2635
+ "total_flos": 2.756290459511145e+20,
2636
+ "train_loss": 0.20740618139383307,
2637
+ "train_runtime": 51184.2268,
2638
+ "train_samples_per_second": 83.03,
2639
+ "train_steps_per_second": 0.649
2640
+ }
2641
+ ],
2642
+ "logging_steps": 100,
2643
+ "max_steps": 33201,
2644
+ "num_input_tokens_seen": 0,
2645
+ "num_train_epochs": 3,
2646
+ "save_steps": 1000,
2647
+ "stateful_callbacks": {
2648
+ "TrainerControl": {
2649
+ "args": {
2650
+ "should_epoch_stop": false,
2651
+ "should_evaluate": false,
2652
+ "should_log": false,
2653
+ "should_save": true,
2654
+ "should_training_stop": true
2655
+ },
2656
+ "attributes": {}
2657
+ }
2658
+ },
2659
+ "total_flos": 2.756290459511145e+20,
2660
+ "train_batch_size": 32,
2661
+ "trial_name": null,
2662
+ "trial_params": null
2663
+ }
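
The evaluation entries above trace WER from roughly 0.107 at step 19,000 down to roughly 0.096 by step 33,000. A minimal sketch (not part of this commit) of how that curve could be read back out of the saved state, assuming the file is available locally under the usual Trainer filename trainer_state.json and that, as is standard for Trainer, these records sit under a log_history key:

```python
import json

# Assumption: the JSON shown in this diff is saved locally as "trainer_state.json".
with open("trainer_state.json", "r", encoding="utf-8") as f:
    state = json.load(f)

# log_history mixes training records ("loss", "learning_rate") and evaluation
# records ("eval_loss", "eval_wer"); keep only the evaluation ones.
eval_points = [(entry["step"], entry["eval_wer"])
               for entry in state["log_history"]
               if "eval_wer" in entry]

for step, wer in eval_points:
    print(f"step {step:>6}: eval WER {wer:.4f}")
```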
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7d812729698c04b3c2d820194f54b82a091c70a274b5890fd475aef73de38a05
+ size 5496
vocab.json ADDED
The diff for this file is too large to render.