---
library_name: transformers
tags:
- Mixtral 8x7B
- Mistral
- merge
- moe
license: apache-2.0
---
<img src="https://huggingface.co./aigeek0x0/radiantloom-mixtral-8x7b-fusion/resolve/main/Radiantloom-Mixtral-8x7B-Fusion.png" alt="Radiantloom Mixtral 8X7B Fusion" width="800" style="margin-left:auto; margin-right:auto; display:block"/>
## Radiantloom Mixtral 8X7B Fusion DPO
This model is a fine-tuned version of [Radiantloom Mixtral 8X7B Fusion](https://huggingface.co./Radiantloom/radiantloom-mixtral-8x7b-fusion), trained with Direct Preference Optimization (DPO).
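As background (the card does not specify the exact training configuration), DPO fine-tunes the policy directly on preference pairs, without fitting a separate reward model. The standard objective from the DPO paper is:

```latex
\mathcal{L}_{\mathrm{DPO}}(\pi_\theta; \pi_{\mathrm{ref}}) =
-\,\mathbb{E}_{(x,\, y_w,\, y_l) \sim \mathcal{D}}
\left[
\log \sigma\!\left(
\beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)}
- \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)}
\right)
\right]
```

Here $x$ is a prompt, $y_w$ and $y_l$ are the preferred and rejected completions, $\pi_{\mathrm{ref}}$ is the frozen reference model (typically the SFT checkpoint, here presumably the Fusion base), and $\beta$ controls how far the policy may drift from the reference.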