---
license: apache-2.0
language:
- fr
- it
- de
- es
- en
tags:
- moe
---
| Name  | Age |
|-------|-----|
| Alice | 24  |
| Bob   | 19  |
# Model Card for Mixtral-8x7B
The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts.
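Below is a minimal usage sketch, assuming the upstream checkpoint `mistralai/Mixtral-8x7B-v0.1` and the Hugging Face `transformers` library; adjust the model ID if you are loading the weights from this repository instead.

```python
# Minimal text-generation sketch for a Mixtral-style checkpoint.
# Assumes the upstream "mistralai/Mixtral-8x7B-v0.1" weights, not this test repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-v0.1"  # assumed upstream checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Encode a prompt, generate a short continuation, and decode it.
inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the full model requires substantial GPU memory; `device_map="auto"` (which needs the `accelerate` package) spreads the weights across available devices.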