mobiuslabsgmbh/Mixtral-8x7B-v0.1-hf-2bit_g16_s128-HQQ
Text Generation · Transformers · mixtral · Mixture of Experts · License: apache-2.0
Librarian Bot: Add moe tag to model
#1 opened Jan 8, 2024 by librarian-bot
base: refs/heads/main ← from: refs/pr/1
Files changed (1)
README.md (+2, -0)
@@ -1,5 +1,7 @@
 ---
 license: apache-2.0
+tags:
+- moe
 train: false
 inference: false
 pipeline_tag: text-generation
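
For reference, this is how the README.md YAML frontmatter would read with the PR applied, reconstructed from the diff above; the closing `---` delimiter falls outside the hunk shown and is assumed here.

```yaml
---
license: apache-2.0
# Tag added by this PR so the model is listed under Mixture of Experts (moe)
tags:
- moe
train: false
inference: false
pipeline_tag: text-generation
---
```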