Update README.md
README.md CHANGED
```diff
@@ -67,7 +67,9 @@ A small 101M param (total) decoder model. This is the first version of the model
 - GQA (24 heads, 8 key-value), context length 1024
 - train-from-scratch
 
-
+## Notes
+
+**This checkpoint** is the 'raw' pre-trained model and has not been tuned to a more specific task. **It should be fine-tuned** before use in most cases.
 
 - For the chat version of this model, please [see here](https://youtu.be/dQw4w9WgXcQ?si=3ePIqrY1dw94KMu4)
 
```
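For reference, a minimal usage sketch of the checkpoint described above follows. It assumes the model is published as a standard Hugging Face causal-LM repo with a Llama-style config; the repo id `your-org/decoder-101m`, the config attribute names, and the prompt are placeholders and assumptions, not details confirmed by this README.

```python
# Minimal sketch, not official usage instructions.
# Assumes a standard Hugging Face causal-LM repo with a Llama-style config;
# "your-org/decoder-101m" is a hypothetical placeholder repo id.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/decoder-101m"  # replace with the real repo id

# Check the architecture described in the README: GQA with 24 query heads,
# 8 key-value heads, and a 1024-token context window (attribute names
# assume a Llama-style config).
config = AutoConfig.from_pretrained(repo_id)
print(config.num_attention_heads)      # expected: 24
print(config.num_key_value_heads)      # expected: 8
print(config.max_position_embeddings)  # expected: 1024

# Load the raw pre-trained checkpoint. It is not instruction-tuned, so treat
# generations as base-model completions and fine-tune before task-specific use.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```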