mobiuslabsgmbh / Mixtral-8x7B-Instruct-v0.1-hf-attn-4bit-moe-2bit-HQQ
Mobius Labs GmbH
Tags: Text Generation · Transformers · mixtral · Mixture of Experts · conversational
License: apache-2.0
Mixtral-8x7B-Instruct-v0.1-hf-attn-4bit-moe-2bit-HQQ / README.md — Commit History
Librarian Bot: Add moe tag to model (#1)
5ee71b6 · mobicham, librarian-bot · committed on Jan 8, 2024
Update README.md
4bf2205 · mobicham · committed on Dec 18, 2023
Update README.md
80be5fa · mobicham · committed on Dec 18, 2023
Create README.md
49d95b0 · mobicham · committed on Dec 15, 2023