Mungert/SmallThinker-4BA0.6B-Instruct
Text Generation · GGUF · English · conversational · License: apache-2.0
Files and versions
SmallThinker-4BA0.6B-Instruct · 47.6 GB total · 1 contributor · History: 15 commits
Latest commit: 302a204 (verified) by Mungert, about 2 months ago: "Upload SmallThinker-4BA0.6B-Instruct-q5_0_l.gguf with huggingface_hub"
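The same file and commit information can also be pulled programmatically. Below is a minimal sketch using the huggingface_hub Python client (`HfApi.list_repo_files` and `HfApi.list_repo_commits`); the attribute names on the returned commit objects reflect the current huggingface_hub API as I recall it and may differ between library versions.

```python
# Minimal sketch: enumerate the files and commit history of this repo
# with the huggingface_hub Python client (pip install huggingface_hub).
from huggingface_hub import HfApi

api = HfApi()
repo_id = "Mungert/SmallThinker-4BA0.6B-Instruct"

# List every file tracked in the repo (the same names shown in the table below).
for filename in api.list_repo_files(repo_id):
    print(filename)

# Walk the commit history ("History: 15 commits" on the repo page).
# Attribute names (commit_id, title) follow the GitCommitInfo objects returned
# by recent huggingface_hub releases and may change between versions.
for commit in api.list_repo_commits(repo_id):
    print(commit.commit_id[:7], commit.title)
```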
All files were uploaded with huggingface_hub about 2 months ago; each file's last commit message is "Upload <filename> with huggingface_hub", and .gitattributes was last touched by the SmallThinker-4BA0.6B-Instruct-q5_0_l.gguf upload commit.

| File | Size | Safety scan | Storage |
|---|---|---|---|
| .gitattributes | 2.45 kB | | |
| README.md | 5.7 kB | Safe | |
| SmallThinker-4BA0.6B-Instruct-bf16.gguf | 8.55 GB | Safe | xet |
| SmallThinker-4BA0.6B-Instruct-bf16_q4_k.gguf | 4.87 GB | | xet |
| SmallThinker-4BA0.6B-Instruct-f16.gguf | 8.55 GB | | xet |
| SmallThinker-4BA0.6B-Instruct-f16_q4_k.gguf | 4.87 GB | | xet |
| SmallThinker-4BA0.6B-Instruct-imatrix.gguf | 16.8 MB | Safe | xet |
| SmallThinker-4BA0.6B-Instruct-q3_k_l.gguf | 2.48 GB | | xet |
| SmallThinker-4BA0.6B-Instruct-q4_0_l.gguf | 2.65 GB | | xet |
| SmallThinker-4BA0.6B-Instruct-q4_k_l.gguf | 2.88 GB | | xet |
| SmallThinker-4BA0.6B-Instruct-q5_0_l.gguf | 3.12 GB | | xet |
| SmallThinker-4BA0.6B-Instruct-q5_1.gguf | 3.21 GB | Safe | xet |
| SmallThinker-4BA0.6B-Instruct-q5_k_l.gguf | 3.28 GB | | xet |
| SmallThinker-4BA0.6B-Instruct-q5_k_s.gguf | 3.13 GB | Safe | xet |
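To fetch one of the quantized GGUF files listed above, the huggingface_hub client can download it by repo and filename. The sketch below picks the q4_k_l quant purely as an example and pins the 302a204 commit shown above; the commented llama-cpp-python loading step is an assumption on my part, since this repo page does not prescribe a runtime.

```python
# Minimal sketch: download one of the GGUF quantizations listed in the table above.
# The q4_k_l file is illustrative; any filename from the table works the same way.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="Mungert/SmallThinker-4BA0.6B-Instruct",
    filename="SmallThinker-4BA0.6B-Instruct-q4_k_l.gguf",
    revision="302a204",  # pin to the commit listed on this page; use the full hash if the short form is not accepted
)
print("Downloaded to:", model_path)

# Assumption: loading with llama-cpp-python (pip install llama-cpp-python),
# one common way to run GGUF files; this page does not mandate a runtime.
# from llama_cpp import Llama
# llm = Llama(model_path=model_path, n_ctx=4096)
# print(llm("Hello, ", max_tokens=32)["choices"][0]["text"])
```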