DavidAU/Mistral-MOE-4X7B-Dark-MultiVerse-Uncensored-Enhanced32-24B-gguf (Text Generation, 24B, GGUF)
Language Learning - Moe - 12 GB GPU + 32 GB RAM (collection, 2 items): Force Model Experts Weights onto CPU