Configure your LLM API connection. Settings are saved in your browser's local storage.
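For illustration, a minimal sketch of persisting such settings with `localStorage`; the `llm-settings` key and the field names are assumptions, not this page's actual storage layout:

```ts
// Hypothetical settings shape; field names are assumptions for illustration.
interface LlmSettings {
  baseUrl: string;
  model: string;
  promptTemplate: string;
}

const STORAGE_KEY = "llm-settings"; // assumed key, not confirmed by this page

function saveSettings(settings: LlmSettings): void {
  // localStorage persists across reloads but is scoped to this page's origin.
  localStorage.setItem(STORAGE_KEY, JSON.stringify(settings));
}

function loadSettings(): LlmSettings | null {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as LlmSettings) : null;
}
```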
Enter the base URL of your Ollama server. E.g., `http://localhost:11434` (the Ollama default).
⚠️ Your Ollama server may need to be configured to accept cross-origin requests from this page (CORS). See the Ollama documentation on setting the `OLLAMA_ORIGINS` environment variable.
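As a sketch of what a browser-side request to Ollama looks like (and why CORS matters), assuming the standard `/api/generate` endpoint; the helper name is illustrative:

```ts
// Sketch: call Ollama's /api/generate from the browser. This fails with a
// CORS error unless the server allows this page's origin, e.g. started with
// OLLAMA_ORIGINS=* ollama serve (permissive; suitable for local development only).
async function ollamaGenerate(baseUrl: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // non-streaming replies carry the text in `response`
}
```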
Enter the base URL of your OpenAI-compatible server. E.g., `http://localhost:8080`
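For comparison, a sketch of the same call against an OpenAI-compatible server, using the standard `/v1/chat/completions` route; again, the helper name is illustrative:

```ts
// Sketch: call an OpenAI-compatible /v1/chat/completions endpoint.
async function chatCompletion(baseUrl: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content; // first choice's message text
}
```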
Enter the model name. For Ollama: a model tag such as `gemma3:270m`. For OpenAI-compatible servers: the model identifier or model file path your server expects.
Use `{{text}}`, `{{source_language}}`, and `{{destination_language}}` as placeholders in your prompt template.
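A minimal sketch of how those placeholders might be substituted before the prompt is sent; this is an assumption about the mechanism, not this page's exact implementation:

```ts
// Sketch: fill the three template placeholders with concrete values.
function fillTemplate(
  template: string,
  values: { text: string; source_language: string; destination_language: string },
): string {
  return template
    .replaceAll("{{text}}", values.text)
    .replaceAll("{{source_language}}", values.source_language)
    .replaceAll("{{destination_language}}", values.destination_language);
}

// Example usage:
const prompt = fillTemplate(
  "Translate from {{source_language}} to {{destination_language}}: {{text}}",
  { text: "Bonjour", source_language: "French", destination_language: "English" },
);
```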