---
license: apache-2.0
language:
- fr
- it
- de
- es
- en
tags:
- moe
---
The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts.
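As a minimal sketch, the checkpoint can be loaded with the Hugging Face `transformers` library; the repository id `mistralai/Mixtral-8x7B-v0.1` used below is an assumption and may need adjusting to the actual hosted name.

```python
# Minimal sketch: loading the model and generating text with transformers.
# The repository id is an assumption, not confirmed by this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-v0.1"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```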