# v4_llama_lora
This model is a fine-tuned version of Daewon0808/prm800k_llama_fulltune on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1835
- Prm accuracy: 0.9216
- Prm precision: 0.9710
- Prm recall: 0.9178
- Prm specificity: 0.9310
- Prm npv: 0.8182
- Prm f1: 0.9437
- Prm f1 neg: 0.8710
- Prm f1 auc: 0.9244
- Prm f1 auc (fixed): 0.9870
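These are the standard binary-classification metrics computed over PRM step labels. The card does not include the evaluation code; as a minimal sketch, the reported values are consistent with the usual confusion-matrix definitions (function and variable names below are illustrative, not the author's script):

```python
from sklearn.metrics import confusion_matrix, roc_auc_score

def prm_metrics(y_true, y_pred, y_score):
    """Standard binary metrics; a sketch, not the card's evaluation code."""
    # sklearn's binary confusion matrix flattens to [tn, fp, fn, tp]
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    precision = tp / (tp + fp)      # positive predictive value
    recall = tp / (tp + fn)         # sensitivity
    specificity = tn / (tn + fp)    # true negative rate
    npv = tn / (tn + fn)            # negative predictive value
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": precision,
        "recall": recall,
        "specificity": specificity,
        "npv": npv,
        "f1": 2 * precision * recall / (precision + recall),
        # F1 of the negative class: NPV and specificity play the roles
        # of precision and recall for label 0
        "f1_neg": 2 * npv * specificity / (npv + specificity),
        "auc": roc_auc_score(y_true, y_score),
    }
```

As a sanity check, the negative-class F1 follows from the reported NPV and specificity: 2 × 0.8182 × 0.9310 / (0.8182 + 0.9310) ≈ 0.8710, matching the listed Prm f1 neg.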
## Model description
More information needed
## Intended uses & limitations
More information needed
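No usage instructions are given on this card. As a hedged sketch (not the author's documented workflow), a PEFT LoRA adapter like this one is typically loaded on top of its base checkpoint; how PRM step scores are then extracted from the model outputs is not specified here and would need to match the training setup:

```python
# Sketch only: assumes the base checkpoint is a standard causal LM and that
# the adapter is published under Daewon0808/v4_llama_lora.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "Daewon0808/prm800k_llama_fulltune"
adapter_id = "Daewon0808/v4_llama_lora"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)  # attach the LoRA weights
model.eval()
```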
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 4
- seed: 908932403
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
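The totals follow from the per-device settings: 2 (per-device train batch) × 8 GPUs × 8 gradient-accumulation steps = 128 effective train examples per optimizer step, and 4 × 8 = 32 for evaluation. A hedged sketch of an equivalent Transformers `TrainingArguments` configuration (an assumed reconstruction from the values above; the actual training script is not included):

```python
from transformers import TrainingArguments

# Mirrors the listed hyperparameters; launched with 8 processes
# (e.g. via accelerate or torchrun) to reach the 128 effective batch.
args = TrainingArguments(
    output_dir="v4_llama_lora",
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    optim="adamw_torch",
    seed=908932403,
)
```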
### Training results
Training Loss | Epoch | Step | Validation Loss | Prm accuracy | Prm precision | Prm recall | Prm specificity | Prm npv | Prm f1 | Prm f1 neg | Prm f1 auc | Prm f1 auc (fixed) |
---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 0 | 0 | 0.2324 | 0.9216 | 0.9577 | 0.9315 | 0.8966 | 0.8387 | 0.9444 | 0.8667 | 0.9140 | 0.9752 |
0.3644 | 0.0113 | 5 | 0.2326 | 0.9216 | 0.9577 | 0.9315 | 0.8966 | 0.8387 | 0.9444 | 0.8667 | 0.9140 | 0.9745 |
0.4155 | 0.0225 | 10 | 0.2321 | 0.9216 | 0.9577 | 0.9315 | 0.8966 | 0.8387 | 0.9444 | 0.8667 | 0.9140 | 0.9761 |
0.3621 | 0.0338 | 15 | 0.2313 | 0.9216 | 0.9577 | 0.9315 | 0.8966 | 0.8387 | 0.9444 | 0.8667 | 0.9140 | 0.9764 |
0.3158 | 0.0450 | 20 | 0.2311 | 0.9118 | 0.9571 | 0.9178 | 0.8966 | 0.8125 | 0.9371 | 0.8525 | 0.9072 | 0.9787 |
0.2875 | 0.0563 | 25 | 0.2320 | 0.8824 | 0.9552 | 0.8767 | 0.8966 | 0.7429 | 0.9143 | 0.8125 | 0.8866 | 0.9828 |
0.3092 | 0.0675 | 30 | 0.2270 | 0.8922 | 0.9697 | 0.8767 | 0.9310 | 0.75 | 0.9209 | 0.8308 | 0.9039 | 0.9854 |
0.2707 | 0.0788 | 35 | 0.2590 | 0.8824 | 0.9692 | 0.8630 | 0.9310 | 0.7297 | 0.9130 | 0.8182 | 0.8970 | 0.9823 |
0.2569 | 0.0900 | 40 | 0.2061 | 0.9216 | 0.9577 | 0.9315 | 0.8966 | 0.8387 | 0.9444 | 0.8667 | 0.9140 | 0.9846 |
0.2094 | 0.1013 | 45 | 0.2621 | 0.8529 | 0.9833 | 0.8082 | 0.9655 | 0.6667 | 0.8872 | 0.7887 | 0.8869 | 0.9842 |
0.2567 | 0.1125 | 50 | 0.2013 | 0.8922 | 0.9559 | 0.8904 | 0.8966 | 0.7647 | 0.9220 | 0.8254 | 0.8935 | 0.9821 |
0.2316 | 0.1238 | 55 | 0.2282 | 0.8922 | 0.9844 | 0.8630 | 0.9655 | 0.7368 | 0.9197 | 0.8358 | 0.9143 | 0.9846 |
0.1892 | 0.1350 | 60 | 0.1978 | 0.8922 | 0.9697 | 0.8767 | 0.9310 | 0.75 | 0.9209 | 0.8308 | 0.9039 | 0.9839 |
0.2541 | 0.1463 | 65 | 0.2150 | 0.8824 | 0.9692 | 0.8630 | 0.9310 | 0.7297 | 0.9130 | 0.8182 | 0.8970 | 0.9830 |
0.1987 | 0.1575 | 70 | 0.2332 | 0.8824 | 0.9692 | 0.8630 | 0.9310 | 0.7297 | 0.9130 | 0.8182 | 0.8970 | 0.9785 |
0.1965 | 0.1688 | 75 | 0.2106 | 0.8824 | 0.9692 | 0.8630 | 0.9310 | 0.7297 | 0.9130 | 0.8182 | 0.8970 | 0.9839 |
0.2652 | 0.1800 | 80 | 0.1784 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9846 |
0.1822 | 0.1913 | 85 | 0.2263 | 0.8627 | 0.9683 | 0.8356 | 0.9310 | 0.6923 | 0.8971 | 0.7941 | 0.8833 | 0.9851 |
0.2278 | 0.2025 | 90 | 0.1838 | 0.9118 | 0.9444 | 0.9315 | 0.8621 | 0.8333 | 0.9379 | 0.8475 | 0.8968 | 0.9849 |
0.2312 | 0.2138 | 95 | 0.2618 | 0.8529 | 0.9677 | 0.8219 | 0.9310 | 0.675 | 0.8889 | 0.7826 | 0.8765 | 0.9842 |
0.2253 | 0.2250 | 100 | 0.2169 | 0.9020 | 0.9437 | 0.9178 | 0.8621 | 0.8065 | 0.9306 | 0.8333 | 0.8899 | 0.9804 |
0.2599 | 0.2363 | 105 | 0.2673 | 0.8529 | 0.9677 | 0.8219 | 0.9310 | 0.675 | 0.8889 | 0.7826 | 0.8765 | 0.9858 |
0.2003 | 0.2475 | 110 | 0.2062 | 0.9118 | 0.9571 | 0.9178 | 0.8966 | 0.8125 | 0.9371 | 0.8525 | 0.9072 | 0.9875 |
0.2161 | 0.2588 | 115 | 0.1936 | 0.9118 | 0.9571 | 0.9178 | 0.8966 | 0.8125 | 0.9371 | 0.8525 | 0.9072 | 0.9887 |
0.184 | 0.2700 | 120 | 0.2195 | 0.9118 | 0.9848 | 0.8904 | 0.9655 | 0.7778 | 0.9353 | 0.8615 | 0.9280 | 0.9875 |
0.2144 | 0.2813 | 125 | 0.1807 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9868 |
0.2033 | 0.2925 | 130 | 0.2090 | 0.8922 | 0.9697 | 0.8767 | 0.9310 | 0.75 | 0.9209 | 0.8308 | 0.9039 | 0.9856 |
0.1829 | 0.3038 | 135 | 0.2103 | 0.8824 | 0.9552 | 0.8767 | 0.8966 | 0.7429 | 0.9143 | 0.8125 | 0.8866 | 0.9804 |
0.1789 | 0.3150 | 140 | 0.2007 | 0.8922 | 0.9429 | 0.9041 | 0.8621 | 0.7812 | 0.9231 | 0.8197 | 0.8831 | 0.9780 |
0.2022 | 0.3263 | 145 | 0.2330 | 0.9020 | 0.9846 | 0.8767 | 0.9655 | 0.7568 | 0.9275 | 0.8485 | 0.9211 | 0.9837 |
0.1567 | 0.3376 | 150 | 0.2110 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9849 |
0.2176 | 0.3488 | 155 | 0.2058 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9844 |
0.1977 | 0.3601 | 160 | 0.2251 | 0.9020 | 0.9846 | 0.8767 | 0.9655 | 0.7568 | 0.9275 | 0.8485 | 0.9211 | 0.9851 |
0.1499 | 0.3713 | 165 | 0.1884 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9849 |
0.1863 | 0.3826 | 170 | 0.1959 | 0.9020 | 0.9701 | 0.8904 | 0.9310 | 0.7714 | 0.9286 | 0.8438 | 0.9107 | 0.9854 |
0.1885 | 0.3938 | 175 | 0.1797 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9870 |
0.2021 | 0.4051 | 180 | 0.2044 | 0.8922 | 0.9697 | 0.8767 | 0.9310 | 0.75 | 0.9209 | 0.8308 | 0.9039 | 0.9887 |
0.2046 | 0.4163 | 185 | 0.1975 | 0.8922 | 0.9697 | 0.8767 | 0.9310 | 0.75 | 0.9209 | 0.8308 | 0.9039 | 0.9887 |
0.1873 | 0.4276 | 190 | 0.1884 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9872 |
0.159 | 0.4388 | 195 | 0.2089 | 0.9020 | 0.9701 | 0.8904 | 0.9310 | 0.7714 | 0.9286 | 0.8438 | 0.9107 | 0.9858 |
0.1659 | 0.4501 | 200 | 0.2153 | 0.9020 | 0.9701 | 0.8904 | 0.9310 | 0.7714 | 0.9286 | 0.8438 | 0.9107 | 0.9858 |
0.1817 | 0.4613 | 205 | 0.2348 | 0.8725 | 0.9688 | 0.8493 | 0.9310 | 0.7105 | 0.9051 | 0.8060 | 0.8902 | 0.9849 |
0.1905 | 0.4726 | 210 | 0.2168 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9846 |
0.1836 | 0.4838 | 215 | 0.1996 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9830 |
0.1858 | 0.4951 | 220 | 0.2070 | 0.9020 | 0.9701 | 0.8904 | 0.9310 | 0.7714 | 0.9286 | 0.8438 | 0.9107 | 0.9839 |
0.1848 | 0.5063 | 225 | 0.2011 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9818 |
0.1811 | 0.5176 | 230 | 0.2117 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9802 |
0.2245 | 0.5288 | 235 | 0.2120 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9821 |
0.1777 | 0.5401 | 240 | 0.2104 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9846 |
0.1734 | 0.5513 | 245 | 0.2032 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9842 |
0.2164 | 0.5626 | 250 | 0.1971 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9825 |
0.2144 | 0.5738 | 255 | 0.2201 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9846 |
0.1975 | 0.5851 | 260 | 0.2109 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9844 |
0.2156 | 0.5963 | 265 | 0.2031 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9846 |
0.1782 | 0.6076 | 270 | 0.1987 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9854 |
0.1407 | 0.6188 | 275 | 0.2158 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9844 |
0.217 | 0.6301 | 280 | 0.2018 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9839 |
0.1692 | 0.6414 | 285 | 0.1782 | 0.9314 | 0.9714 | 0.9315 | 0.9310 | 0.8438 | 0.9510 | 0.8852 | 0.9313 | 0.9839 |
0.205 | 0.6526 | 290 | 0.2072 | 0.8922 | 0.9697 | 0.8767 | 0.9310 | 0.75 | 0.9209 | 0.8308 | 0.9039 | 0.9872 |
0.202 | 0.6639 | 295 | 0.2026 | 0.9020 | 0.9701 | 0.8904 | 0.9310 | 0.7714 | 0.9286 | 0.8438 | 0.9107 | 0.9887 |
0.2301 | 0.6751 | 300 | 0.1639 | 0.9314 | 0.9714 | 0.9315 | 0.9310 | 0.8438 | 0.9510 | 0.8852 | 0.9313 | 0.9882 |
0.191 | 0.6864 | 305 | 0.1622 | 0.9314 | 0.9714 | 0.9315 | 0.9310 | 0.8438 | 0.9510 | 0.8852 | 0.9313 | 0.9880 |
0.1106 | 0.6976 | 310 | 0.1938 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9870 |
0.1775 | 0.7089 | 315 | 0.2062 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9872 |
0.1925 | 0.7201 | 320 | 0.2050 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9863 |
0.1682 | 0.7314 | 325 | 0.1887 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9861 |
0.2101 | 0.7426 | 330 | 0.1903 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9856 |
0.1981 | 0.7539 | 335 | 0.1901 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9856 |
0.1825 | 0.7651 | 340 | 0.2022 | 0.9118 | 0.9706 | 0.9041 | 0.9310 | 0.7941 | 0.9362 | 0.8571 | 0.9176 | 0.9858 |
0.1738 | 0.7764 | 345 | 0.1987 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9868 |
0.201 | 0.7876 | 350 | 0.1846 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9854 |
0.1731 | 0.7989 | 355 | 0.1802 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9870 |
0.1312 | 0.8101 | 360 | 0.1851 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9875 |
0.1875 | 0.8214 | 365 | 0.1947 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9877 |
0.1893 | 0.8326 | 370 | 0.1997 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9877 |
0.2138 | 0.8439 | 375 | 0.2016 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9877 |
0.1851 | 0.8551 | 380 | 0.1967 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9875 |
0.1925 | 0.8664 | 385 | 0.1896 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9872 |
0.1852 | 0.8776 | 390 | 0.1849 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9872 |
0.1777 | 0.8889 | 395 | 0.1820 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9872 |
0.1501 | 0.9001 | 400 | 0.1812 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9872 |
0.1285 | 0.9114 | 405 | 0.1820 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9868 |
0.1392 | 0.9226 | 410 | 0.1804 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9868 |
0.1671 | 0.9339 | 415 | 0.1811 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9872 |
0.1544 | 0.9451 | 420 | 0.1828 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9875 |
0.2063 | 0.9564 | 425 | 0.1831 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9870 |
0.199 | 0.9677 | 430 | 0.1821 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9875 |
0.1551 | 0.9789 | 435 | 0.1846 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9872 |
0.1838 | 0.9902 | 440 | 0.1835 | 0.9216 | 0.9710 | 0.9178 | 0.9310 | 0.8182 | 0.9437 | 0.8710 | 0.9244 | 0.9870 |
### Framework versions
- PEFT 0.12.0
- Transformers 4.46.0
- Pytorch 2.4.0+cu118
- Datasets 3.0.0
- Tokenizers 0.20.1
### Model tree for Daewon0808/v4_llama_lora

This adapter descends from the following checkpoints:
- meta-llama/Llama-3.1-8B (base model)
- meta-llama/Llama-3.1-8B-Instruct (finetuned)
- Daewon0808/prm800k_llama_fulltune (finetuned; direct parent of this adapter)