xxx777xxxASD
committed
Commit 96692eb • 1 Parent(s): c7fa25e
Update README.md
README.md CHANGED
@@ -6,9 +6,11 @@ tags:
 - moe
 ---
 
-
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f5e51289c121cb864ba464/oHq8uPY_H6SC-sfA-Wx7L.png)
 
+> [!IMPORTANT]
+> Check for [ChaoticSoliloquy-v1.5-4x8B](https://huggingface.co/xxx777xxxASD/L3-ChaoticSoliloquy-v1.5-4x8B)
+
 Experimental RP-oriented MoE; the idea was to get a model equal to or better than Mixtral 8x7B and its finetunes in RP/ERP tasks.
 
 [GGUF, Exl2](https://huggingface.co/collections/xxx777xxxASD/chaoticsoliloquy-4x8b-6628a759b5a60d8d3f51ed62)