Quantized models (4bit) request

#4
by terminator33 - opened

Can we please get a 4-bit quantized model!!

llama.cpp does not support this model yet; I have an issue open on their GitHub. I will quantize it as soon as that's resolved, and I assume others will as well.

Cohere For AI org

We released a 4-bit quantized model here today: https://huggingface.co./CohereForAI/c4ai-command-r-plus-4bit. Enjoy!
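For anyone who wants to try it from Python, here is a minimal loading sketch, assuming a recent `transformers` with `bitsandbytes` and `accelerate` installed (the 4-bit repo should bundle its own quantization config, so no extra quantization arguments are passed here; the prompt is just a placeholder):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "CohereForAI/c4ai-command-r-plus-4bit"

# Load tokenizer and the pre-quantized 4-bit checkpoint, spreading layers across available devices
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Format a single user turn with the model's chat template
messages = [{"role": "user", "content": "Hello, how are you?"}]
input_ids = tokenizer.apply_chat_template(
    messages, tokenize=True, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode a short reply
gen_tokens = model.generate(input_ids, max_new_tokens=100, do_sample=True, temperature=0.3)
print(tokenizer.decode(gen_tokens[0], skip_special_tokens=True))
```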

sarahooker changed discussion status to closed
