AmpereComputing / qwen-3-a22b-235b-gguf
Tags: GGUF · conversational
Files and versions
Branch: main
1 contributor · History: 20 commits
Latest commit by jangrzybek: "Upload Qwen3-235B-A22B-128K-Q8R16-00009-of-00009.gguf with huggingface_hub" (bdac558, verified, about 1 month ago)
All files were last updated about 1 month ago and are marked Safe. Each GGUF shard was uploaded in its own commit ("Upload <filename> with huggingface_hub") and is stored via Xet; .gitattributes was last touched by the final Q8R16 shard upload, and README.md was added in a "Create README.md" commit.

| File | Size |
| --- | --- |
| .gitattributes | 3.02 kB |
| Qwen3-235B-A22B-128K-Q4_K_4-00001-of-00009.gguf | 17.3 GB |
| Qwen3-235B-A22B-128K-Q4_K_4-00002-of-00009.gguf | 16.4 GB |
| Qwen3-235B-A22B-128K-Q4_K_4-00003-of-00009.gguf | 16.2 GB |
| Qwen3-235B-A22B-128K-Q4_K_4-00004-of-00009.gguf | 15.5 GB |
| Qwen3-235B-A22B-128K-Q4_K_4-00005-of-00009.gguf | 16.6 GB |
| Qwen3-235B-A22B-128K-Q4_K_4-00006-of-00009.gguf | 16.2 GB |
| Qwen3-235B-A22B-128K-Q4_K_4-00007-of-00009.gguf | 15.5 GB |
| Qwen3-235B-A22B-128K-Q4_K_4-00008-of-00009.gguf | 17 GB |
| Qwen3-235B-A22B-128K-Q4_K_4-00009-of-00009.gguf | 14.7 GB |
| Qwen3-235B-A22B-128K-Q8R16-00001-of-00009.gguf | 26.4 GB |
| Qwen3-235B-A22B-128K-Q8R16-00002-of-00009.gguf | 27.5 GB |
| Qwen3-235B-A22B-128K-Q8R16-00003-of-00009.gguf | 26.8 GB |
| Qwen3-235B-A22B-128K-Q8R16-00004-of-00009.gguf | 26 GB |
| Qwen3-235B-A22B-128K-Q8R16-00005-of-00009.gguf | 27.5 GB |
| Qwen3-235B-A22B-128K-Q8R16-00006-of-00009.gguf | 26.8 GB |
| Qwen3-235B-A22B-128K-Q8R16-00007-of-00009.gguf | 26 GB |
| Qwen3-235B-A22B-128K-Q8R16-00008-of-00009.gguf | 27.5 GB |
| Qwen3-235B-A22B-128K-Q8R16-00009-of-00009.gguf | 22.6 GB |
| README.md | 3.22 kB |
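The commit messages show the shards were uploaded with huggingface_hub, and the same library can fetch them. Below is a minimal sketch, assuming the repo id shown above and that only the Q4_K_4 shards are wanted; the local directory name is illustrative, not part of this repo.

```python
# Sketch: download one quantization's GGUF shards with huggingface_hub.
# Assumes the repo id shown on this page; local_dir is an arbitrary example path.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="AmpereComputing/qwen-3-a22b-235b-gguf",
    allow_patterns=["Qwen3-235B-A22B-128K-Q4_K_4-*.gguf"],  # only the Q4_K_4 shards
    local_dir="qwen3-235b-a22b-q4_k_4",
)
print(local_path)  # directory containing the downloaded shards
```

Tools that understand split GGUF files (for example llama.cpp) typically load the remaining shards automatically when pointed at the first file of the set (the -00001-of-00009 shard).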