RubielLabarta committed (verified)
Commit 962d412 · 1 Parent(s): 3d9523c

Update README.md

Files changed (1)
1. README.md +2 -3
README.md CHANGED
@@ -13,8 +13,7 @@ base_model:
 
 # LogoS-7Bx2-MoE-13B-v0.1
 
-Fine-tuned model on English and Spanish language using MoE method.
-Model description
+Model built by @RubielLabarta using the SLERP merge method. The model is released for research purposes only; commercial use is not allowed.
 
-The LogoS is a model to experiment with the MoE method, which could significantly increase the performance of the original model. The model has 12.9B parameters, and this model is fine-tuned with The Pile.
+LogoS is a model for experimenting with the MoE method, which could significantly increase the performance of the original model. The model has 12.9B parameters.
 
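For context on the merge method credited in the new text: SLERP (spherical linear interpolation) merging blends matching weight tensors from two parent checkpoints along the arc between them rather than the straight line, which preserves tensor norms better than plain averaging. The sketch below is a minimal illustration of that interpolation only; the tensor shapes, the stand-in weights, and the `t = 0.5` factor are hypothetical, not the actual recipe used for LogoS-7Bx2-MoE-13B-v0.1.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors."""
    # Normalized copies are used only to measure the angle between the vectors.
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)
    # Nearly colinear vectors: fall back to plain linear interpolation.
    if np.abs(np.sin(theta)) < eps:
        return (1.0 - t) * v0 + t * v1
    # Interpolate along the arc between the two original (unnormalized) tensors.
    return (np.sin((1.0 - t) * theta) * v0 + np.sin(t * theta) * v1) / np.sin(theta)

# Hypothetical usage: blend the same layer from two donor checkpoints.
layer_a = np.random.randn(4096)  # stand-in for a weight tensor from parent model A
layer_b = np.random.randn(4096)  # stand-in for the matching tensor from parent model B
merged = slerp(0.5, layer_a, layer_b)
```

In a real merge this interpolation is applied tensor by tensor across both checkpoints, often with a per-layer interpolation schedule rather than a single global factor.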