
Onde Inference

On-device LLM inference — optimized for Apple silicon.

Swift SDK · Flutter SDK · Website


In production

Onde powers live App Store apps with fully on-device chat: no server, no network latency, and no data leaving the device.

Download on the App Store


License

Onde is dual-licensed under MIT and Apache 2.0. You may use it under either license at your option.

Dependency attribution

| Dependency | License | Author |
|---|---|---|
| mistral.rs | MIT | Eric Buehler |
| UniFFI | MPL-2.0 | Mozilla |
| tokio | MIT | Tokio contributors |

Model licenses

Models downloaded by Onde carry their own licenses, independent of this crate's. By using Onde, you also agree to the license of any model you load:

| Model | License |
|---|---|
| Qwen 2.5 1.5B / 3B Instruct | Qwen Community License |
| Qwen 2.5 Coder 1.5B / 3B / 7B | Qwen Community License |

© 2026 Onde Inference