mlx-community/DeepSeek-R1-Distill-Qwen-14B-abliterated-v2-Q4-mlx (Text Generation, 2B params, updated Jan 29, 52 downloads, 2 likes)
mlx-community/Qwen2.5-QwQ-35B-Eureka-Cubed-abliterated-uncensored-4bit (Text Generation, 5B params, updated Mar 8, 53 downloads, 1 like)
mlx-community/meta-llama-Llama-4-Scout-17B-16E-fp16 (Text Generation, 108B params, updated Apr 6, 1.15k downloads, 3 likes)
mlx-community/Llama-3.2-8X4B-MOE-V2-Dark-Champion-Instruct-uncensored-abliterated-21B-MLX (Text Generation, 21B params, updated Jun 25, 193 downloads, 1 like)
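All of the repos above are published in MLX format, so they are typically loaded with the mlx-lm Python package on Apple silicon. A minimal sketch, assuming mlx-lm is installed (pip install mlx-lm); the repo ID is taken from the list above, and the prompt and token budget are purely illustrative:

```python
# Minimal sketch: load one of the mlx-community repos listed above with
# mlx-lm and run a short generation. Assumes `pip install mlx-lm` on
# Apple silicon; prompt and max_tokens are illustrative values.
from mlx_lm import load, generate

model, tokenizer = load(
    "mlx-community/DeepSeek-R1-Distill-Qwen-14B-abliterated-v2-Q4-mlx"
)

prompt = "Summarize what 4-bit quantization changes about a model's weights."
text = generate(model, tokenizer, prompt=prompt, max_tokens=200)
print(text)
```

The 4-bit repos (Q4 / 4bit in the name) trade some accuracy for a much smaller memory footprint than the fp16 upload, which is why the larger listings ship in both forms.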