yeonseok-zeticai posted an update Jun 25
🚀 Real-Time On-Device AI Agent with Polaris-4B — Run It Yourself, No Cloud, No Cost

We just deployed a real-time on-device AI agent using the Polaris-4B-Preview model — one of the top-performing <6B open LLMs on Hugging Face.

📱 What’s remarkable?
This model runs entirely on a mobile device, without cloud, and without any manual optimization. It was built using ZETIC.MLange, and the best part?

➡️ It’s totally automated, free to use, and anyone can do it.
You don’t need to write deployment code, tweak backends, or touch device-specific SDKs. Just upload your model — and ZETIC.MLange handles the rest.
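Upload itself happens through the MLange dashboard, so there is no deployment code to show here. As a minimal sketch of the only manual step left, here is how you might pull the stock Polaris-4B-Preview weights from the Hugging Face Hub before uploading them; the repo id is an assumption, so check the model card for the exact path.

```python
# Minimal sketch: fetch the stock Polaris-4B-Preview weights locally before
# uploading them via the MLange dashboard. The repo id below is an assumption;
# check the model card on Hugging Face for the exact path.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="POLARIS-Project/Polaris-4B-Preview",  # assumed repo id
    local_dir="./polaris-4b-preview",
)
print(f"Model files downloaded to: {local_dir}")
```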

🧠 About the Model
- Model: Polaris-4B-Preview
- Size: ~4B parameters
- Ranking: Top 3 on Hugging Face LLM Leaderboard (<6B)
- Inference: token-incremental (streaming) generation supported (see the sketch after this list)
- Modifications: None — stock weights, just optimized for mobile
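The post doesn't show the runtime API that MLange generates, so as a rough illustration of what token-incremental decoding looks like, here is a minimal sketch using llama-cpp-python against a GGUF build of the model. The file name and generation settings are assumptions, not part of MLange.

```python
# Minimal sketch of token-incremental (streaming) decoding with
# llama-cpp-python and a GGUF build of Polaris-4B-Preview.
# The file name and generation settings are assumptions, not MLange's API.
from llama_cpp import Llama

llm = Llama(model_path="polaris-4b-preview-q4_k_m.gguf", n_ctx=4096)

# stream=True yields tokens one at a time instead of waiting for the full answer
for chunk in llm("Explain on-device AI in one sentence.", max_tokens=128, stream=True):
    print(chunk["choices"][0]["text"], end="", flush=True)
```

Streaming the tokens as they are produced is what keeps the on-device experience feeling real-time, even on mobile hardware.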

⚙️ What ZETIC.MLange Does
ZETIC.MLange is a fully automated deployment framework for On-Device AI, built for AI engineers who want to focus on models — not infrastructure.

Here’s what it does in minutes:
- 📊 Analyzes model structure
- ⚙️ Converts to a mobile-optimized format (e.g., GGUF, ONNX; one such conversion is sketched manually below)
- 📦 Generates a runnable runtime environment with pre/post-processing
- 📱 Targets real mobile hardware (CPU, GPU, NPU — including Qualcomm, MediaTek, Apple)
- 🎯 Gives you a downloadable SDK or mobile app component — ready to run
And yes — this is available now, for free, at https://mlange.zetic.ai
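For comparison, here is roughly what one of those conversion steps looks like when done by hand, using Hugging Face Optimum to export a causal LM to ONNX. This is not MLange's internal code, just a sketch of the kind of work the framework automates; the repo id is an assumption.

```python
# Rough illustration of one step MLange automates: exporting a causal LM to
# ONNX by hand with Hugging Face Optimum. Not MLange's internal code, just a
# manual version of the "convert to a mobile-optimized format" step.
from optimum.onnxruntime import ORTModelForCausalLM
from transformers import AutoTokenizer

model_id = "POLARIS-Project/Polaris-4B-Preview"  # assumed repo id

# export=True runs the PyTorch -> ONNX conversion while loading the model
ort_model = ORTModelForCausalLM.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

ort_model.save_pretrained("./polaris-4b-onnx")
tokenizer.save_pretrained("./polaris-4b-onnx")
```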

🧪 For AI Engineers Like You
If you want to:
- Test LLMs directly on-device
- Run models offline with no network latency
- Avoid cloud GPU costs
- Deploy to mobile without writing app-side inference code

Then this is your moment. You can do exactly what we did, using your own models — all in a few clicks.

🎯 Start here → https://mlange.zetic.ai

📬 Want to try Polaris-4B in your own app? Reach out at [email protected], or just visit https://mlange.zetic.ai. It's open and free to use!

Great work @Chancy, @Zhihui, @tobiaslee!