RubielLabarta
committed on
Update README.md
README.md CHANGED

@@ -12,8 +12,6 @@ base_model:
-- TomGrc/FusionNet_7Bx2_MoE_14B
 ---
 
-
-
 # LogoS-7Bx2-MoE-13B-v0.1
 
 Fine-tuned model on English and Spanish using the MoE method.
@@ -21,5 +19,3 @@ Model description
 
 LogoS is a model for experimenting with the MoE method, which can significantly increase performance over the original models. The model has 12.9B parameters and is fine-tuned on The Pile.
 
-base_model:
-- SOLAR-10.7B-Instruct-v1.0
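The card's description mentions the MoE method and a 12.9B parameter count without showing what expert routing looks like. Below is a generic, minimal sketch of top-2 routing over two experts, the pattern behind "7Bx2" merges such as the FusionNet base listed in the YAML. It illustrates the technique only; it is not LogoS's actual code, and every name in it is hypothetical.

```python
# Generic sketch of top-2 mixture-of-experts routing (Mixtral-style).
# It illustrates why a "7Bx2" MoE has ~12.9B parameters while spending
# roughly one expert's worth of FFN compute per token at larger expert
# counts. Not the actual LogoS implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    def __init__(self, hidden: int, ffn: int, n_experts: int = 2):
        super().__init__()
        self.gate = nn.Linear(hidden, n_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(hidden, ffn), nn.SiLU(), nn.Linear(ffn, hidden))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden). Score experts, keep the top 2 per token,
        # and mix their outputs with softmax-normalized gate weights.
        scores = self.gate(x)                      # (tokens, n_experts)
        weights, idx = scores.topk(2, dim=-1)      # (tokens, 2) each
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(2):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e           # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out
```

With only two experts and top-2 routing, both experts fire on every token and the gate just learns the mixing weights; routing only becomes sparse, and the compute savings real, at larger expert counts.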
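Since the README describes a causal language model hosted on the Hub, a minimal loading sketch may help. The repository id below is an assumption inferred from the committer name and the model name in this card; adjust it to the actual repository.

```python
# Minimal sketch: loading the model with Hugging Face transformers.
# The repo id is assumed from the committer and model name in this README.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # ~26 GB for 12.9B parameters in fp16
    device_map="auto",          # spread layers across available devices
)

# The card says the model is tuned on English and Spanish, so either works.
prompt = "¿Cuál es la capital de España?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```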