amd/Mistral-7B-Instruct-v0.3-awq-g128-int4-asym-fp16-onnx-hybrid
ONNX
License: apache-2.0
Files and versions (branch: main)
2 contributors · History: 11 commits
Latest commit: f8e1493 (verified) · uday610 · Create config.json · 12 days ago
File                                     Size       LFS   Last commit          Updated
.gitattributes                           1.59 kB          Upload 9 files       26 days ago
Mistral-7B-Instruct-v0.3_jit.bin         3.87 GB    LFS   Upload 9 files       26 days ago
Mistral-7B-Instruct-v0.3_jit.onnx        294 kB     LFS   Upload 9 files       26 days ago
Mistral-7B-Instruct-v0.3_jit.onnx.data   3.97 GB    LFS   Upload 9 files       26 days ago
Mistral-7B-Instruct-v0.3_jit.pb.bin      7.7 kB     LFS   Upload 9 files       26 days ago
README.md                                3 kB             Update README.md     12 days ago
config.json                              2 Bytes          Create config.json   12 days ago
genai_config.json                        1.74 kB          Upload 9 files       26 days ago
special_tokens_map.json                  551 Bytes        Upload 9 files       26 days ago
tokenizer.json                           3.67 MB          Upload 9 files       26 days ago
tokenizer.model                          587 kB     LFS   Upload 9 files       26 days ago
tokenizer_config.json                    141 kB           Upload 9 files       26 days ago
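
For reference, the files listed above can be fetched locally with the standard huggingface_hub client. This is a minimal sketch, assuming huggingface_hub is installed; the local_dir path is an arbitrary example, not part of the repository.

```python
# Minimal sketch: download every file in this repository with huggingface_hub.
# Assumes `pip install huggingface_hub`; the local_dir below is a hypothetical target directory.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="amd/Mistral-7B-Instruct-v0.3-awq-g128-int4-asym-fp16-onnx-hybrid",
    local_dir="./mistral-7b-instruct-v0.3-onnx-hybrid",  # arbitrary example path
)
print(f"Model files downloaded to: {local_path}")
```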