Local LLM Chat
Settings
API Type: Ollama or OpenAI-Compatible
Server URL
⚠️ Your server must be configured to accept requests from this page (CORS); one way to allow this for Ollama is sketched just after the settings list.
API Key (optional)
Model Name
Temperature
Top P
Top K
Seed
Max Tokens
Stream Response
System Prompt (optional)
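
The CORS warning above exists because this page runs entirely in the browser and calls your LLM server directly. The TypeScript sketch below, assuming a default Ollama install at http://localhost:11434, checks that the server is reachable; the OLLAMA_ORIGINS environment variable mentioned in the comments is Ollama's setting for which browser origins it will answer.

    // Minimal connectivity check against an Ollama server, run from this page.
    // For the browser request to succeed, Ollama must allow this page's origin,
    // e.g. start it with:  OLLAMA_ORIGINS="https://your-page-origin" ollama serve
    // (OLLAMA_ORIGINS="*" allows any origin; convenient for testing, use with care.)
    async function checkOllama(serverUrl = "http://localhost:11434"): Promise<void> {
      const res = await fetch(`${serverUrl}/api/tags`); // lists the models installed locally
      if (!res.ok) throw new Error(`Server responded with HTTP ${res.status}`);
      const data = await res.json();
      console.log("Available models:", data.models?.map((m: { name: string }) => m.name));
    }

If the call is blocked, the browser console shows a CORS error rather than an HTTP status; that is the usual sign that OLLAMA_ORIGINS (or your proxy's CORS headers) needs adjusting.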
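
To make the remaining fields concrete, here is one plausible mapping from these settings onto an OpenAI-compatible chat request. The request fields (model, temperature, top_p, seed, max_tokens, stream, plus a system message built from the System Prompt) follow the common /v1/chat/completions schema; Top K is flagged separately because it is not part of that schema, though Ollama's native API accepts it. The type and function names are illustrative, not this page's actual code.

    // Illustrative mapping of the settings above onto a chat request.
    interface ChatSettings {
      serverUrl: string;      // Server URL
      apiKey?: string;        // API Key (optional)
      model: string;          // Model Name
      temperature: number;    // Temperature
      topP: number;           // Top P
      topK: number;           // Top K (see note in the request body below)
      seed?: number;          // Seed
      maxTokens: number;      // Max Tokens
      stream: boolean;        // Stream Response
      systemPrompt?: string;  // System Prompt (optional)
    }

    async function sendChat(s: ChatSettings, userMessage: string): Promise<Response> {
      // The System Prompt becomes a leading "system" message when present.
      const messages = [
        ...(s.systemPrompt ? [{ role: "system", content: s.systemPrompt }] : []),
        { role: "user", content: userMessage },
      ];
      return fetch(`${s.serverUrl}/v1/chat/completions`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          // The API key, if given, is sent as a Bearer token.
          ...(s.apiKey ? { Authorization: `Bearer ${s.apiKey}` } : {}),
        },
        body: JSON.stringify({
          model: s.model,
          messages,
          temperature: s.temperature,
          top_p: s.topP,
          seed: s.seed,
          max_tokens: s.maxTokens,
          stream: s.stream,
          // top_k is not in the OpenAI schema; Ollama's native /api/chat
          // endpoint would take it (and the other sampling settings) under
          // "options", e.g. options: { top_k: s.topK, num_predict: s.maxTokens }.
        }),
      });
    }

Exact support for fields such as seed and top_k varies between servers, so treat this as a sketch of the mapping rather than a guarantee of what any particular backend honors.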