This is a fully experimental model! It is a Mixtral-style mixture-of-experts merge of the two source models listed in the config below.
```yaml
base_model: beomi/EXAONE-3.5-2.4B-Instruct-Llamafied
gate_mode: random
architecture: mixtral
experts_per_token: 2
dtype: bfloat16
experts:
  - source_model: beomi/EXAONE-3.5-2.4B-Instruct-Llamafied
  - source_model: unsloth/Phi-3.5-mini-instruct
```
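As a rough illustration of what `gate_mode: random` and `experts_per_token: 2` mean, here is a minimal, dependency-free sketch of top-k MoE routing (the function name and toy dimensions are hypothetical, not part of the merge): router weights start as small random values, each token's hidden state is scored against every expert, and the top-2 experts mix the token's output with softmax-normalized weights.

```python
import math
import random

def route_token(hidden, gate_weights, k=2):
    """Score each expert, keep the top-k, and softmax their scores."""
    scores = [sum(h * w for h, w in zip(hidden, col)) for col in gate_weights]
    topk = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
    exps = [math.exp(scores[i]) for i in topk]
    total = sum(exps)
    return [(i, e / total) for i, e in zip(topk, exps)]

random.seed(0)
hidden = [random.gauss(0, 1) for _ in range(8)]
# gate_mode: random -> router weights are randomly initialized, not trained
gates = [[random.gauss(0, 0.02) for _ in range(8)] for _ in range(2)]
print(route_token(hidden, gates))  # two (expert_index, weight) pairs
```

Note that with only two experts and `experts_per_token: 2`, every token is routed through both experts; the random gate only determines how their outputs are mixed.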