---
language:
  - en
  - es
tags:
  - moe
  - merge
base_model:
  - yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
  - TomGrc/FusionNet_7Bx2_MoE_14B
---

# LogoS-7Bx2-MoE-13B-v0.1

Model built by @RubielLabarta using the SLERP merge method. The model is released for research purposes only; commercial use is not allowed.
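
For reference, a minimal sketch of spherical linear interpolation (SLERP) applied per weight tensor, as merge tools typically do; this is an illustrative implementation under that assumption, not the exact script used to build this model:

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns `a`, t=1 returns `b`; intermediate values follow the
    great-circle arc between the normalized tensors.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    # Angle between the two tensors, clamped for numerical safety.
    omega = torch.acos((a_unit * b_unit).sum().clamp(-1.0, 1.0))
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        merged = (1.0 - t) * a_flat + t * b_flat
    else:
        merged = (torch.sin((1.0 - t) * omega) / sin_omega) * a_flat \
               + (torch.sin(t * omega) / sin_omega) * b_flat
    return merged.reshape(a.shape).to(a.dtype)
```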

LogoS is an experiment with the Mixture-of-Experts (MoE) method, which can significantly improve the performance of the original models. The model has 12.9B parameters.
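
A minimal usage sketch with 🤗 Transformers; the repository id `RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1` is assumed from the model name above:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # spread the 12.9B parameters across available devices
)

prompt = "Explain mixture-of-experts models in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```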