Article: M2.1: Multilingual and Multi-Task Coding with Strong Generalization (5 days ago)
Post 3578: 2025.1: DeepSeek entered the scene, backed by High-Flyer Quant. 2026.1: IQuest enters the game, backed by Uniquant Quant 📈, launching IQuest-Coder on Hugging Face: https://huggingface.co/collections/IQuestLab/iquest-coder
✨ 40B models: Instruct / Thinking / Loop
✨ Loop = MoE-level performance with only ~5% extra training cost
✨ Native 128K context
Post 3319: I have updated my Arabic speech datasets collection (https://huggingface.co/collections/MohamedRashad/arabic-speech-datasets) with new datasets, bringing the full audio data to more than 3,000 hours of good Arabic speech. Feel free to use it in your new innovations, and happy new year!
Post 5540: Thank you @clem (Co-Founder & CEO of Hugging Face) for sharing my dataset on X / Twitter! ronantakizawa/github-top-developers #github #dataset
Post 5396: NVIDIA releases Nemotron 3 Nano, a new 30B hybrid reasoning model! 🔥 It has a 1M context window and best-in-class performance on SWE-Bench, reasoning, and chat. Run the MoE model locally with 24GB RAM.
GGUF: unsloth/Nemotron-3-Nano-30B-A3B-GGUF
💚 Step-by-step guide: https://docs.unsloth.ai/models/nemotron-3