Configure your AI learning experience
Run LLMs locally - free and private; requires an Ollama installation
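For context, a minimal sketch of getting a local model running with Ollama (model name `llama3` is an example; any model from the Ollama library works):

```shell
# Pull a model once, then run it locally (assumes Ollama is installed and running)
ollama pull llama3

# Ollama serves a local HTTP API on port 11434 by default;
# the app can send prompts to it without any data leaving your machine
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

Because everything runs on localhost, prompts and responses never leave your device.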
Use different models for different tasks: a smarter model for in-depth content generation, and a faster/cheaper model for chat. Leave disabled to use the same provider for everything.
Enable the AI to search the web and cite sources in answers.
🔒 Stored locally only - used for web search when enabled in chat
Your data stays yours. Here's exactly what happens:
Stored only in your browser's local storage. Never sent to our servers.
Generated on-the-fly by your AI provider. We don't store or collect it.
Your completed topics and learning path are stored so they sync across devices.
Processed directly by your AI provider. Not stored on our servers.