chargoddard
committed on
Merge branch 'main' of https://huggingface.co./chargoddard/llama33b-16k into main
README.md ADDED
@@ -0,0 +1,10 @@
+---
+datasets:
+- EleutherAI/wikitext_document_level
+tags:
+- llama
+---
+LLaMA 33b finetuned on `wikitext_document_level` with a linear RoPE scaling factor of 8, for a 16k-token context length.
+This is a merged version of [llama33b-16k-qlora](https://huggingface.co/chargoddard/llama33b-16k-qlora).
+
+Note that this is *not* an instruct model - it is base LLaMA with an extended sequence length.
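
Because the 16k context comes from linear RoPE scaling applied during finetuning, the merged weights load like any other LLaMA checkpoint. Below is a minimal loading sketch, assuming the standard `transformers` `rope_scaling` config option; passing it explicitly is only needed if the checkpoint's `config.json` does not already record the scaling.

```python
# Minimal loading sketch (assumes transformers >= 4.31 with accelerate installed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "chargoddard/llama33b-16k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    # Linear RoPE scaling: 2048 (base LLaMA context) * 8 = 16384 tokens.
    # Redundant if config.json already carries this setting.
    rope_scaling={"type": "linear", "factor": 8.0},
)

inputs = tokenizer("The history of positional embeddings", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```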
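
As for the merge itself, the following is a hedged sketch of the common `peft` route for folding a QLoRA adapter into dense weights; the base checkpoint name is an assumption for illustration, not taken from this commit.

```python
# Hypothetical merge sketch using peft; not necessarily the exact script used here.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-30b",  # assumed base checkpoint (hypothetical)
    torch_dtype=torch.float16,
)
# Attach the LoRA adapter, fold its low-rank deltas into the dense
# weights, and drop the adapter wrappers.
adapted = PeftModel.from_pretrained(base, "chargoddard/llama33b-16k-qlora")
merged = adapted.merge_and_unload()
merged.save_pretrained("llama33b-16k-merged")
```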