---
license: apache-2.0
language:
- fr
- it
- de
- es
- en
tags:
- moe
---
| Name              |   Age |
|-------------------|-------|
| Alice             |    24 |
| Bob               |    19 |

# Model Card for Mixtral-8x7B
The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts.
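
In a Sparse Mixture of Experts layer, a router selects a small subset of expert networks per token (Mixtral routes each token to 2 of 8 experts) and combines their outputs with the renormalized router weights. The following is a minimal sketch of that top-k routing idea, not Mixtral's actual implementation; the linear gate, the `experts` callables, and all names are hypothetical.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights, top_k=2):
    """Sketch of sparse top-k expert routing for a single token vector x.

    experts:      list of callables, each mapping a vector to a vector
    gate_weights: one weight row per expert for a hypothetical linear gate
    """
    # Router logits: one score per expert.
    logits = [sum(w * xi for w, xi in zip(ws, x)) for ws in gate_weights]
    # Keep only the top-k scoring experts and renormalize their weights.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:top_k]
    probs = softmax([logits[i] for i in top])
    # Output is the probability-weighted sum of the selected experts only;
    # the remaining experts are never evaluated, which is the "sparse" part.
    out = [0.0] * len(x)
    for p, i in zip(probs, top):
        y = experts[i](x)
        out = [o + p * yi for o, yi in zip(out, y)]
    return out

# Toy usage: three scaling "experts", a 2-d input, top-2 routing.
experts = [lambda x, s=s: [s * xi for xi in x] for s in (1.0, 2.0, 3.0)]
gate_weights = [[1.0, 0.0], [0.0, 1.0], [2.0, 0.0]]
y = moe_forward([1.0, 0.0], experts, gate_weights, top_k=2)
```

Because only `top_k` experts run per token, the compute cost scales with the active experts rather than the total parameter count, which is why Mixtral has 8x7B parameters but a much smaller per-token inference cost.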