New fixed quants? [BPE support]
#5
by RachidAR - opened
Since BPE tokenizer support is now merged (https://github.com/ggerganov/llama.cpp/pull/6920), a re-quant is required for the best model quality.
- Thanks in advance
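For reference, re-quantizing after this tokenizer change generally means re-running the GGUF conversion from the original HF weights (so the updated BPE pre-tokenizer metadata is written into the file) and then quantizing again. Below is a minimal sketch of that workflow; the script and binary names (`convert-hf-to-gguf.py`, `./quantize`), the paths, and the `Q4_K_M` quant type are assumptions about a typical llama.cpp checkout and may differ from the maintainer's actual pipeline.

```python
# Sketch: re-convert and re-quantize a model after the BPE pre-tokenizer change.
# Assumes a llama.cpp checkout with convert-hf-to-gguf.py and a built ./quantize
# binary; model paths and the quant type below are placeholders.
import subprocess

hf_model_dir = "path/to/original-hf-model"   # original HF weights (placeholder)
f16_gguf = "model-f16.gguf"                  # intermediate full-precision GGUF
quantized_gguf = "model-Q4_K_M.gguf"         # final quantized output (example type)

# 1) Re-run conversion so the new GGUF carries the updated pre-tokenizer metadata.
subprocess.run(
    ["python", "convert-hf-to-gguf.py", hf_model_dir,
     "--outfile", f16_gguf, "--outtype", "f16"],
    check=True,
)

# 2) Quantize the freshly converted GGUF.
subprocess.run(
    ["./quantize", f16_gguf, quantized_gguf, "Q4_K_M"],
    check=True,
)
```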
Yup, it's in the process right now :)
RachidAR changed discussion status to closed