This model is a TIES merge of notux-8x7b-v1 and UNAversal-8x7B-v1beta, with MixtralOrochi8x7B as the base model.
I was very impressed with MixtralOrochi8x7B's performance and multifaceted use cases, as it is already a merge of several useful Mixtral models such as Mixtral Instruct, Noromaid-v0.1-mixtral, openbuddy-mixtral, and possibly other models that were not named. My goal was to expand the model's capabilities and make it even more useful, perhaps even competitive with closed-source models like GPT-4, though more testing is required to confirm that. I hope the community can help me determine whether it deserves its name. 😊
Base model:

Instruct template: Alpaca

Merger config:

```yaml
models:
  - model: notux-8x7b-v1
    parameters:
```
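The config excerpt above follows mergekit's YAML format for TIES merges. As a point of reference, a complete configuration for a merge like this one would take roughly the following shape — note that the `density` and `weight` values below are illustrative placeholders, not the actual values used for this model:

```yaml
# Hypothetical mergekit TIES config sketch; density/weight values are placeholders.
models:
  - model: notux-8x7b-v1
    parameters:
      density: 0.5   # fraction of parameters kept from this model's task vector
      weight: 0.5    # relative contribution in the merged result
  - model: UNAversal-8x7B-v1beta
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: MixtralOrochi8x7B
parameters:
  normalize: true    # rescale weights so contributions sum to 1
dtype: bfloat16
```

In TIES merging, each contributing model is expressed as a delta against the base model; `density` sparsifies those deltas before sign-consensus resolution, which is why the base model (here MixtralOrochi8x7B) is listed separately rather than under `models`.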