Your most sensitive notes, your private thoughts — other AI plugins send them to someone's servers. Private AI doesn't.
- ✅ Your notes stay on your device
- ✅ Run local LLMs such as gpt-oss, Qwen3, Gemma3, DeepSeek, Mistral, and many more on your own hardware
- ✅ No data is transmitted to external services
- ✅ No analytics or tracking
Compatible With: LM Studio
- Go to Community Plugins
- Search for "Private AI" and click Install
- Click Enable
- Enjoy!
- Easy Setup: Easy setup and model swapping with LM Studio
- Cross Platform: Support for most modern Mac and Windows machines
- Integrated Vault Search: Automatically searches your Obsidian vault for relevant information, provides contextual responses, and cites the specific notes it drew from
- Open Tab Context: Focus your conversation on specific notes for targeted insights
- Performance Tuning: Customize models, search parameters, token limits, and more to tune performance for your hardware
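Because LM Studio exposes an OpenAI-compatible server on localhost, a local-only chat request is just an HTTP call that never leaves your machine. The sketch below illustrates the idea; the default port (1234), the model name, and the function names are assumptions for illustration, not the plugin's actual code.

```typescript
// Minimal sketch of a local-only chat request, assuming LM Studio's
// OpenAI-compatible server on its default address (http://localhost:1234/v1).
// Model name and options are illustrative.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
  max_tokens: number;
  stream: boolean;
}

// Build the JSON body for a /v1/chat/completions request.
function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  maxTokens = 512
): ChatRequest {
  return { model, messages, max_tokens: maxTokens, stream: false };
}

async function askLocalModel(prompt: string): Promise<string> {
  const body = buildChatRequest("qwen3-8b", [
    { role: "system", content: "Answer using only the provided notes." },
    { role: "user", content: prompt },
  ]);
  // The request targets localhost only: nothing is sent to external servers.
  const res = await fetch("http://localhost:1234/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Token limits and model choice map directly onto the performance-tuning settings above: smaller `max_tokens` and smaller models trade answer length for speed on modest hardware.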
- "I have to have a difficult conversation with a friend, read my journal entry about the situation and help me come up with a good way to talk to my friend compassionately"
- "My partner was a jerk to me, help me look at other alternative reasons why they may have reacted the way they did"
- "Talk to this journal entry like a good friend"
- "When did I first meet Jacob?"
- "What did Frank talk about at our meeting on Obsidian plugins?"
- "What did I write about machine learning?"
- "What are my thoughts on productivity systems?"
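Questions like those above depend on vault search: the plugin has to pick which notes to hand the model as context. As a rough illustration only (the plugin's real retrieval logic is not shown here, and these function names are hypothetical), a minimal keyword-scoring pass over note text might look like:

```typescript
// Hypothetical sketch of keyword scoring for vault search; the plugin's
// actual retrieval may use a different, more sophisticated approach.

// Score a note by how many distinct query terms it contains.
function scoreNote(query: string, noteText: string): number {
  const terms = query.toLowerCase().split(/\W+/).filter((t) => t.length > 2);
  const text = noteText.toLowerCase();
  return terms.filter((t) => text.includes(t)).length;
}

// Rank notes and keep the best matches to pass to the model as context,
// so answers can cite the specific notes they came from.
function topNotes(
  query: string,
  notes: { path: string; text: string }[],
  k = 3
): string[] {
  return notes
    .map((n) => ({ path: n.path, score: scoreNote(query, n.text) }))
    .filter((n) => n.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((n) => n.path);
}
```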
Contributions are welcome! Please feel free to submit issues and pull requests.
This project respects and is compatible with the original licenses of all code and dependencies used:
- esbuild - MIT License - Used for bundling the plugin
- TypeScript - Apache-2.0 License - Used for type safety
- Obsidian API - MIT License - Official Obsidian plugin API
All development dependencies are used under their respective open-source licenses (MIT, Apache-2.0, ISC, BSD) and are properly externalized in the build process.
This plugin integrates with local LLM services but does not include any of their code:
- LM Studio - Proprietary - Local LLM interface
This project is licensed under the MIT License.
