ReactorAI

Chat with Ollama and Gemini AI models on macOS, iOS, and Windows

Buy Me a Coffee

Like ReactorAI? Support future updates ❤️

Frequently Asked Questions

Is ReactorAI free to use?

Yes! All core ReactorAI features are completely free, including AI chat, model switching, session management, and more. Advanced features added in the future may be offered as optional paid extras, but the essential functionality will always remain free.

Will ReactorAI always be free?

The core features of ReactorAI will always remain free. We may introduce optional premium features in the future, but chatting with AI models, managing sessions, and the rest of the essential functionality will never require payment.

Do I need to install Ollama separately?

For Ollama models, yes. ReactorAI can connect to Ollama servers running locally or remotely. For Gemini models, you'll need a Gemini API key instead.
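
If you are not sure whether Ollama is installed and running, you can check the server directly before configuring ReactorAI. Below is a minimal sketch in Python, assuming Ollama's default local address (http://localhost:11434); it is only a connectivity check, not part of ReactorAI itself.

```python
# Minimal sketch: check that an Ollama server is reachable before pointing
# ReactorAI at it. Assumes Ollama's default local address and port (11434);
# change OLLAMA_URL if your server runs elsewhere.
import urllib.request

OLLAMA_URL = "http://localhost:11434"

try:
    with urllib.request.urlopen(OLLAMA_URL, timeout=5) as response:
        # A running Ollama server answers its root URL with "Ollama is running".
        print(response.read().decode())
except OSError as error:
    print(f"Could not reach Ollama at {OLLAMA_URL}: {error}")
```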

Which platforms does ReactorAI support?

ReactorAI is available for macOS, iOS, and Windows via their respective app stores.

Can I use my own models with ReactorAI?

Yes. For Ollama, any model accessible through your Ollama server can be used. For Gemini, you can access Google's Gemini models through the API.
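
To see which models your Ollama server currently exposes (and therefore which models ReactorAI can use), you can query Ollama's standard model listing. A minimal sketch, assuming the default local address:

```python
# Minimal sketch: list the models available on an Ollama server.
# Assumes the default local address; swap in your remote server's URL if needed.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as response:
    data = json.load(response)

# /api/tags returns an object with a "models" array; each entry has a "name"
# such as "llama3:latest" (the same names you pass to `ollama pull`).
for model in data.get("models", []):
    print(model["name"])
```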

Does ReactorAI store my conversations?

All data is stored locally on your device. You can back up or delete sessions at any time through the interface.

How do I connect to a remote server or Gemini?

In the app settings, you can replace the default server URL with your remote Ollama server's address, or configure your Gemini API key for Google's models.
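
If a Gemini connection fails, it can help to confirm that the API key works outside the app. The sketch below calls Google's public Generative Language REST endpoint directly; the model name is only an example, and this is not how ReactorAI itself issues requests.

```python
# Minimal sketch: verify a Gemini API key against Google's REST API.
# The model name is an example; replace API_KEY with your own key.
import json
import urllib.request

API_KEY = "YOUR_GEMINI_API_KEY"   # created in Google AI Studio
MODEL = "gemini-1.5-flash"        # example model name
URL = (
    "https://generativelanguage.googleapis.com/v1beta/"
    f"models/{MODEL}:generateContent?key={API_KEY}"
)

body = json.dumps({"contents": [{"parts": [{"text": "Say hello."}]}]}).encode()
request = urllib.request.Request(
    URL, data=body, headers={"Content-Type": "application/json"}
)

with urllib.request.urlopen(request, timeout=30) as response:
    reply = json.load(response)

# The generated text sits in the first candidate's content parts.
print(reply["candidates"][0]["content"]["parts"][0]["text"])
```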

What is Model Context Protocol (MCP) support?

MCP is an open protocol that lets AI models interact with external tools and data sources; support for it is an upcoming ReactorAI feature. ReactorAI will first support file system MCP (letting AI access your local files) and later allow custom MCP server connections for advanced integrations.