Falcon-H1: A Family of Hybrid-Head Language Models Redefining Efficiency and Performance Paper • 2507.22448 • Published 26 days ago • 64
Article Announcing NeurIPS 2025 E2LM Competition: Early Training Evaluation of Language Models By tiiuae and 8 others • Jul 4 • 9
RADIO Collection A collection of Foundation Vision Models that combine multiple models (CLIP, DINOv2, SAM, etc.). • 14 items • Updated 6 days ago • 24
Falcon-H1 Collection Falcon-H1 Family of Hybrid-Head Language Models (Transformer-SSM), including 0.5B, 1.5B, 1.5B-Deep, 3B, 7B, and 34B (pretrained & instruction-tuned). • 38 items • Updated 25 days ago • 52
Article Falcon-H1: A Family of Hybrid-Head Language Models Redefining Efficiency and Performance By tiiuae and 5 others • May 21 • 34
Falcon3 Mamba 7B Instruct Playground 🐍 Chat with a language model about any topic • Running on Zero • 16
The Ultra-Scale Playbook 🌌 The ultimate guide to training LLMs on large GPU clusters • Running • 3.11k