RubielLabarta committed: Update README.md
# LogoS-7Bx2-MoE-13B-v0.1

Model built by @RubielLabarta using the SLERP merge method. The model is released for research purposes only; commercial use is not allowed.

LogoS is a model for experimenting with the MoE method, which can significantly increase the performance of the original model. The model has 12.9B parameters.
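The SLERP merge named above interpolates along the great circle between two models' flattened weight tensors instead of averaging them linearly, which preserves the magnitude characteristics of each parent. A minimal NumPy sketch of the interpolation itself (an illustration, not the exact merge configuration used to build LogoS):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors at ratio t."""
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Measure the angle between the two tensors via their unit vectors.
    n0 = v0f / (np.linalg.norm(v0f) + eps)
    n1 = v1f / (np.linalg.norm(v1f) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    omega = np.arccos(dot)
    if abs(np.sin(omega)) < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return ((1.0 - t) * v0f + t * v1f).reshape(v0.shape).astype(v0.dtype)
    s = np.sin(omega)
    out = (np.sin((1.0 - t) * omega) / s) * v0f + (np.sin(t * omega) / s) * v1f
    return out.reshape(v0.shape).astype(v0.dtype)

# Midpoint between two orthogonal vectors lies on the unit circle at 45°.
mid = slerp(0.5, np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

In a real merge this function would be applied per layer, often with a different `t` schedule for attention and MLP weights.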