Model Details
This is an AWQ (GEMV) quant of magnum-v3-34b: https://huggingface.co./anthracite-org/magnum-v3-34b
Model Description
The model was quantized on 6×RTX 4090 GPUs with the following quantization parameters:
"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMV"