Frequently Asked Questions
Do I need to install Ollama separately?
Yes. ReactorAI is a client for Ollama, so you must have the Ollama server running locally or have access to a remote one.
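One quick way to confirm your Ollama server is reachable, assuming the default port 11434:

```shell
# Query the version endpoint of a locally running Ollama server
curl http://localhost:11434/api/version

# If nothing responds, start the server in a terminal
ollama serve
```

A JSON reply such as {"version":"0.x.y"} means the server is up and ReactorAI should be able to connect.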
Which platforms does ReactorAI support?
ReactorAI is available for macOS, iOS, and Windows via their respective app stores.
Can I use my own models with ReactorAI?
Yes. As long as the model is accessible through your Ollama server, you can select and chat with it through the app.
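A sketch of adding a model to your Ollama server from the command line (the model name below is only an example; any model from the Ollama library or a local Modelfile works the same way):

```shell
# Download a model to the Ollama server
ollama pull llama3

# List the models the server can serve; these are what ReactorAI can select
ollama list
```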
Does ReactorAI store my conversations?
Conversations are stored locally on your device. You can back up or delete sessions at any time through the interface.
How do I connect to a remote server?
In the app settings, replace the default server URL with your remote Ollama server’s address and port (Ollama listens on port 11434 by default).
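A minimal sketch of exposing Ollama on a remote machine, assuming you control that machine and its firewall (the IP address below is an example):

```shell
# By default Ollama binds to localhost only; bind to all interfaces
# so other devices on the network can reach it
OLLAMA_HOST=0.0.0.0 ollama serve

# In ReactorAI's settings, the server URL would then look like:
# http://192.168.1.50:11434
```

Only expose the server on networks you trust, since Ollama's API is unauthenticated by default.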