GPTQ quantization of https://huggingface.co./KoboldAI/PPO_Pygway-6b-Mix

Using this repository: https://github.com/mayaeary/GPTQ-for-LLaMa/tree/gptj-v2

Command (the c4 argument selects the C4 calibration dataset used by the GPTQ pass):

python3 gptj.py models/PPO_Pygway-6b-Mix c4 --wbits 4 --groupsize 128 --save_safetensors models/PPO_Pygway-6b-Mix-4bit-128g.safetensors
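
The command above writes a 4-bit, group-size-128 checkpoint in safetensors format. As a minimal sketch (not part of the original card), the snippet below shows one way to inspect the tensors inside the produced file using the safetensors Python package; the file path is the output path from the command above, and the layer names printed depend on how the repository packs the quantized weights.

# Minimal sketch (assumption: the `safetensors` and `torch` packages are installed).
# Lists the tensors stored in the quantized checkpoint produced above.
from safetensors import safe_open

path = "models/PPO_Pygway-6b-Mix-4bit-128g.safetensors"

with safe_open(path, framework="pt") as f:
    for name in f.keys():
        tensor = f.get_tensor(name)
        # A GPTQ checkpoint typically stores packed integer weights plus
        # per-group scales/zeros alongside unquantized layers (embeddings, norms).
        print(name, tuple(tensor.shape), tensor.dtype)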