Typhoon 2 Text Collection
Latest Official Text ThaiLLM release by SCB 10X.
Llama3.1-Typhoon2-70B: Thai Large Language Model (Pretrained)
Llama3.1-Typhoon2-70B is a pretrained-only Thai 🇹🇭 large language model with 70 billion parameters, based on Llama3.1-70B.
For the technical report, please see our arXiv paper: https://arxiv.org/abs/2412.13702.

*To acknowledge Meta's effort in creating the foundation model and to comply with the license, we explicitly include "llama-3.1" in the model name.
| Model | ThaiExam | ONET | IC | A-Level | TGAT | TPAT | M3Exam | Math | Science | Social | Thai |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Typhoon1.5x 70B Instruct | 62.96% | 60.49% | 71.57% | 53.54% | 72.30% | 56.89% | 62.54% | 45.70% | 62.56% | 77.73% | 64.19% |
| Llama3.1 70B | 60.74% | 62.34% | 67.36% | 53.54% | 66.15% | 54.31% | 60.35% | 38.91% | 62.56% | 76.99% | 62.96% |
| Typhoon2 Llama 3.1 70B base | 63.39% | 65.43% | 69.47% | 59.84% | 66.15% | 56.03% | 62.33% | 42.98% | 63.28% | 78.60% | 64.47% |
This model is a pretrained base model, so it may not follow human instructions without one-/few-shot prompting or instruction fine-tuning. It has no moderation mechanisms and may generate harmful or inappropriate responses.
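Because this is a base model rather than an instruct model, prompts typically need to carry their own in-context examples. Below is a minimal sketch of few-shot prompt construction; the Q/A template and the helper name are illustrative choices, not part of the model card:

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt for a base (non-instruct) LLM.

    examples: list of (question, answer) pairs shown as demonstrations.
    query: the new question, left open so the model completes the answer.
    """
    parts = [f"Q: {q}\nA: {a}" for q, a in examples]
    parts.append(f"Q: {query}\nA:")  # trailing "A:" invites a completion
    return "\n\n".join(parts)


examples = [
    ("What is the capital of Thailand?", "Bangkok"),
    ("What is the capital of Japan?", "Tokyo"),
]
prompt = build_few_shot_prompt(examples, "What is the capital of France?")
print(prompt)
```

The resulting string would then be tokenized and passed to the model's `generate` call (e.g. via the Hugging Face transformers library); the demonstrations steer the base model toward answering in the same short Q/A format.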
https://twitter.com/opentyphoon
```
@misc{typhoon2,
    title={Typhoon 2: A Family of Open Text and Multimodal Thai Large Language Models},
    author={Kunat Pipatanakul and Potsawee Manakul and Natapong Nitarach and Warit Sirichotedumrong and Surapon Nonesung and Teetouch Jaknamon and Parinthapat Pengpun and Pittawat Taveekitworachai and Adisai Na-Thalang and Sittipong Sripaisarnmongkol and Krisanapong Jirayoot and Kasima Tharnpipitchai},
    year={2024},
    eprint={2412.13702},
    archivePrefix={arXiv},
    primaryClass={cs.CL},
    url={https://arxiv.org/abs/2412.13702},
}
```