Olaf Kowalsky
Olafangensan
AI & ML interests
None yet
Recent Activity
Reacted to ccocks-deca's post with 🔥, 3 days ago:
12 hours ago:
```
Something big* coming
*big = biggest in the world
```
Annnnnd... here it is! https://huggingface.co/deca-ai/3-alpha-ultra - the largest AI model in the world, from https://huggingface.co/deca-ai, clocking in at a whopping 4.6T parameters. Apologies for the delay, but we're stoked to finally drop this, even in its alpha stage. Before you dive in, here are a few things to keep in mind:
1. **No commercial use yet**: We're still working on Deca 2.5 (Proprietary), and releasing Deca 3 for commercial use right now would impact that. Once Deca 3.5 hits in early '26, we’ll be opening it up with a more permissive license.
2. **Built on existing models**: Deca 3 isn’t a ground-up creation—it’s a huge step forward, building on what’s already out there.
3. **It’s experimental**: As much as we’re hyped about its scale, it’s still in testing.
4. **DynaMoE architecture**: Run a (very) small part of the model with 64GB of RAM/VRAM (when quantized - quants coming soon), or the whole thing with 1TB. It's that scalable (see the rough memory sketch after this list).
5. **Not widely supported yet**: Frameworks like vLLM and Transformers aren’t compatible with Deca 3 at the moment, so until we drop the DynaMoE software (beta coming soon), it’s mostly just a concept.
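For a rough sense of what those numbers imply, here is a back-of-the-envelope estimate. The 4.6T parameter count comes from the post; the bit widths, the formula, and the snippet itself are illustrative assumptions only, not Deca's actual quantization scheme, and they ignore activations, KV cache, and runtime overhead.

```
# Back-of-the-envelope weight-memory math for a 4.6T-parameter model.
# Assumption (not from the post): memory ~= params * bits_per_param / 8,
# ignoring activations, KV cache, and framework overhead.

TOTAL_PARAMS = 4.6e12

def weight_memory_gb(params: float, bits_per_param: float) -> float:
    """Approximate weight storage in GB at a given precision."""
    return params * bits_per_param / 8 / 1e9

for label, bits in [("bf16", 16), ("8-bit", 8), ("4-bit", 4), ("2-bit", 2)]:
    print(f"{label:>6}: ~{weight_memory_gb(TOTAL_PARAMS, bits):,.0f} GB")

# Approximate output:
#   bf16: ~9,200 GB
#  8-bit: ~4,600 GB
#  4-bit: ~2,300 GB
#  2-bit: ~1,150 GB
#
# So "the whole thing with 1TB" presumably assumes aggressive quantization,
# while the 64GB figure would mean loading only a small subset of the
# DynaMoE experts at a time.
```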
We’re super excited to see what you do with it once the full setup’s ready. Hang tight, and stay tuned!
New activity, 13 days ago: bartowski/huizimao_gpt-oss-120b-uncensored-bf16-GGUF: "Why are the file sizes so similar?"
New activity, about 1 month ago: bartowski/ai21labs_AI21-Jamba-Mini-1.7-GGUF: "Is the prompt format correct?"
Organizations
None yet