---
license: apache-2.0
tags:
- moe
language:
- en
library_name: transformers
---
# Model Card for Model ID
This is a mixture of experts created with mergekit and based on mistralai/Mistral-7B-v0.1.
## Model Details
### Model Description
- Developed by: The Kaitchup
- Model type: Causal language model
- Language(s) (NLP): English
- License: Apache 2.0
The method and code used to create this model are explained here: Maixtchup: Make Your Own Mixture of Experts with Mergekit
## Uses
This model is pre-trained and has not been fine-tuned. You can fine-tune it with adapters using PEFT.
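As a minimal sketch of such a fine-tuning setup, the snippet below attaches a LoRA adapter to the model with PEFT. The model ID `kaitchup/moe-model` is a placeholder (the real repository name is not stated above), and the hyperparameter values are illustrative, not recommendations from the card:

```python
def lora_hyperparams():
    """Illustrative LoRA settings for a Mistral-based MoE (not from the card)."""
    return {
        "r": 16,                                 # adapter rank
        "lora_alpha": 32,                        # scaling factor
        "lora_dropout": 0.05,
        "target_modules": ["q_proj", "v_proj"],  # attention projections to adapt
        "task_type": "CAUSAL_LM",
    }


def main():
    # Heavy imports kept inside main() so the sketch can be read without
    # transformers/peft installed; run on a machine with enough memory.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    model_id = "kaitchup/moe-model"  # placeholder, replace with the real repo ID
    model = AutoModelForCausalLM.from_pretrained(model_id)
    tokenizer = AutoTokenizer.from_pretrained(model_id)

    # Wrap the base model so only the small adapter weights are trained.
    peft_model = get_peft_model(model, LoraConfig(**lora_hyperparams()))
    peft_model.print_trainable_parameters()
    return peft_model, tokenizer
```

Training the adapter (e.g., with `transformers.Trainer`) then updates only the LoRA weights, which keeps memory requirements far below full fine-tuning of the merged experts.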