---
license: apache-2.0
language:
- fr
- it
- de
- es
- en
tags:
- moe
---
# Model Card for Mixtral-8x7B
The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts (SMoE) model.
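In a sparse mixture-of-experts layer, a router scores all experts for each token but only the top-scoring few are actually evaluated. The toy NumPy sketch below illustrates the idea with top-2 routing over 8 experts (the shape Mixtral uses); it is an illustration of the general technique, not Mixtral's actual implementation, and all names in it are made up for the example.

```python
import numpy as np

def top2_moe(x, gate_w, experts):
    """Route token vector x (shape (d,)) to the 2 best of n experts.

    gate_w: (d, n) router weights; experts: list of n (d, d) matrices.
    """
    logits = x @ gate_w                      # one router score per expert
    top2 = np.argsort(logits)[-2:]           # indices of the two best experts
    weights = np.exp(logits[top2])
    weights /= weights.sum()                 # softmax over the selected experts only
    # Only the two chosen experts run; their outputs are mixed by the gate weights.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top2))

rng = np.random.default_rng(0)
d, n = 4, 8                                  # toy hidden size; 8 experts, top-2, as in Mixtral
x = rng.standard_normal(d)
gate_w = rng.standard_normal((d, n))
experts = [rng.standard_normal((d, d)) for _ in range(n)]
y = top2_moe(x, gate_w, experts)
print(y.shape)  # (4,)
```

Because only 2 of the 8 experts run per token, the layer uses roughly a quarter of the compute of a dense layer with the same total parameter count.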