DopeorNope committed
Commit e0503b7 · 1 Parent(s): 64c8ef9
Update README.md
README.md CHANGED
@@ -43,7 +43,7 @@ SOLARC-MOE-10.7Bx6 is an auto-regressive language model based on the SOLAR archi
 
 I have built a model using the Mixture of Experts (MOE) approach, utilizing each of these models as the base.
 
-I wanted to test if it was possible with a non-power of 2, like with 6
+I wanted to test if it was possible to compile with a non-power of 2, like with 6
 
 ---
 
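For context on the claim in the diff: nothing in standard top-k MoE routing requires a power-of-2 expert count, since the gate is just a linear projection to `num_experts` logits. The following is a minimal, hypothetical PyTorch sketch of routing over 6 experts; `TopKRouter` and all parameter values are illustrative and are not the model's actual code.

```python
# Minimal sketch (illustrative, not SOLARC-MOE's implementation): a top-k MoE
# router works for any expert count, including a non-power-of-2 like 6.
import torch
import torch.nn as nn


class TopKRouter(nn.Module):
    """Route each token to the top-k of num_experts experts."""

    def __init__(self, hidden_size: int, num_experts: int = 6, top_k: int = 2):
        super().__init__()
        # Gate is a plain linear map to one logit per expert; num_experts is
        # unconstrained, so 6 works as well as 8.
        self.gate = nn.Linear(hidden_size, num_experts, bias=False)
        self.top_k = top_k

    def forward(self, x: torch.Tensor):
        # x: (tokens, hidden) -> expert logits: (tokens, num_experts)
        logits = self.gate(x)
        # Pick the top-k experts per token and normalize their weights.
        weights, indices = torch.topk(logits, self.top_k, dim=-1)
        weights = torch.softmax(weights, dim=-1)
        return weights, indices  # per-token expert mixing weights and ids


router = TopKRouter(hidden_size=4096, num_experts=6)  # 6 experts, no power-of-2 needed
w, idx = router(torch.randn(10, 4096))
print(w.shape, idx.shape)  # torch.Size([10, 2]) torch.Size([10, 2])
```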