A frankenMoE built only from DPO-tuned models. Use it with Chat-instruct mode enabled. I will post evaluations for it. :)

- [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) - router
- [udkai/Turdus](https://huggingface.co/udkai/Turdus) - expert #1
- [argilla/distilabeled-Marcoro14-7B-slerp](https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp) - expert #2
- [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) - expert #3
- [Neuronovo/neuronovo-9B-v0.3](https://huggingface.co/Neuronovo/neuronovo-9B-v0.3) - expert #4
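
If you want to try the merged model, a minimal loading sketch with the Hugging Face `transformers` library is shown below. The repo id is a placeholder (this card does not name the final repository), and `device_map="auto"` assumes `accelerate` is installed.

```python
# Minimal usage sketch. "your-namespace/your-frankenmoe" is a placeholder repo id;
# substitute the actual Hub id of this model.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/your-frankenmoe"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # spread the weights across available devices (needs accelerate)
)

prompt = "Explain what a Mixture of Experts model is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
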
# "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
### (From the MistralAI papers... click the quoted question above to go directly to the post.)
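
As a rough illustration of the idea behind the link above: in a sparse MoE layer, a small router scores each token against every expert, only the top-k experts (top-2 in the Mixtral setup) actually run for that token, and their outputs are blended by the router's weights. The sketch below is a toy top-2 routing step in PyTorch, purely illustrative and not the code of this merge.

```python
# Toy top-2 MoE routing step (illustrative only; not this model's actual implementation).
import torch
import torch.nn.functional as F

num_experts, hidden = 4, 8
token = torch.randn(hidden)                        # one token's hidden state
router = torch.nn.Linear(hidden, num_experts)      # router scores each expert
experts = [torch.nn.Linear(hidden, hidden) for _ in range(num_experts)]

logits = router(token)                             # one score per expert
weights, picked = torch.topk(F.softmax(logits, dim=-1), k=2)
weights = weights / weights.sum()                  # renormalize over the chosen experts

# Only the two selected experts run; their outputs are blended by the router weights.
output = sum(w * experts[i](token) for w, i in zip(weights.tolist(), picked.tolist()))
print(picked.tolist(), output.shape)
```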