Update README.md
README.md CHANGED
@@ -33,6 +33,8 @@ Medorca-2x7b is a Mixure of Experts (MoE) made with the following models:
 | HellaSwag | 76.04 | **76.19** | | | |
 | Winogrande | **74.51** | 73.48 | | | |
 
+More details on the Open LLM Leaderboard evaluation results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medorca-2x7b.)
+
 ## 🧩 Configuration
 
 ```yaml
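
The hunk ends just as the `## 🧩 Configuration` section's YAML block opens, so the merge recipe itself is not part of this change. For orientation only, here is a minimal sketch of the mergekit-moe configuration format that LazyMergekit-style 2x7b MoE model cards typically document. The base model, expert names, and routing prompts below are placeholder assumptions, not the actual Medorca-2x7b recipe.

```yaml
# Hypothetical mergekit-moe recipe sketch; all model names and prompts
# are placeholders, not the actual Medorca-2x7b configuration.
base_model: mistralai/Mistral-7B-v0.1  # placeholder shared base
gate_mode: hidden                       # route tokens by hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: example/medical-expert-7b   # placeholder expert 1
    positive_prompts:
      - "medicine"
      - "clinical diagnosis"
  - source_model: example/general-expert-7b   # placeholder expert 2
    positive_prompts:
      - "reasoning"
      - "general knowledge"
```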