sho-takase committed on
Commit
30348c3
1 Parent(s): 00c38ae

revise readme

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -41,7 +41,7 @@ We constructed this Sarashina2.1-1B model, which consists of 1 billion parameter
  First, we trained the model on 10 trillion tokens, including Japanese and English data extracted from web corpora.
  Then, we trained the model using 1 trillion tokens, predominantly consisting of Japanese data, to enhance its performance in Japanese.
  The following tables show the model's performance on Japanese and English tasks.
- We also show the performance of other public LLMs with approximately 1 billion parameters.
+ We also show the performance of other public LLMs for reference.
  
  #### Evaluation in Japanese tasks
  