Ollama Translate
AI-Powered Translation
{{ connectionStatus }}
⚙️
From
Detect Language
{{ lang.name }}
⇄
To
Add target language
{{ lang.name }}
Manual
Live
Style
Add writing style
{{ style.name }}
{{ getLanguageName(lang) }}
×
{{ getStyleName(style) }}
×
Translating...
{{ getLanguageName(translation.language) }}
{{ getStyleName(translation.style) }}
{{ translation.text }}
Translation will appear here...
{{ translationMode === 'live' ? '⚡ Live mode: Start typing to translate automatically' : '👆 Manual mode: Type text and click the Translate button' }}
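The hint above distinguishes the two modes: live mode translates as you type, manual mode waits for the Translate button. A minimal sketch of how live mode could be wired with a debounced watcher; the 500 ms delay, the ref names, and the translate() helper are assumptions, not the app's actual code.

```ts
import { ref, watch } from 'vue'

// translate() is assumed to be the app's request helper; declared here only so the sketch type-checks.
declare function translate(text: string): Promise<void>

const inputText = ref('')
const translationMode = ref<'manual' | 'live'>('live')

let debounceTimer: ReturnType<typeof setTimeout> | undefined

watch(inputText, (text) => {
  if (translationMode.value !== 'live') return        // manual mode waits for the Translate button
  clearTimeout(debounceTimer)
  debounceTimer = setTimeout(() => {
    if (text.trim().length > 0) void translate(text)  // debounced so the model isn't called on every keystroke
  }, 500)                                             // 500 ms is an assumed delay
})
```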
{{ inputText.length }} / {{ maxLength }}
★
Clear
{{ isLoading ? 'Translating...' : (translationMode === 'live' ? '⚡ Translate Now' : '🔄 Translate') }}
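Both the Translate button and live mode ultimately send the input text to the selected backend. A hedged sketch of a single request against Ollama's /api/generate endpoint; the prompt wording and variable names are assumptions, only the request/response shape ({ model, prompt, stream } → { response }) comes from the Ollama API.

```ts
// Translate one piece of text into one target language (optionally with a writing style).
async function translateWithOllama(
  serverUrl: string,          // e.g. "http://localhost:11434" (assumed default)
  model: string,              // e.g. "llama3"
  text: string,
  targetLanguage: string,
  style?: string,
): Promise<string> {
  const prompt =
    `Translate the following text into ${targetLanguage}` +
    (style ? ` using a ${style} writing style` : '') +
    `. Return only the translation.\n\n${text}`

  const res = await fetch(`${serverUrl}/api/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, prompt, stream: false }),
  })
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`)

  const data = await res.json() as { response: string }
  return data.response.trim()
}
```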
Recent Translations
Clear
No recent translations
{{ item.source }}
{{ item.translations.map(t => `${getLanguageName(t.language)}: ${t.text}`).join(' • ') }}
Styles: {{ item.styles.map(s => getStyleName(s)).join(', ') }}
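The expressions above imply the shape of a history entry (item.source, item.translations with a language and text, item.styles). A sketch of matching TypeScript types, inferred from those interpolations and otherwise assumed:

```ts
interface TranslationResult {
  language: string   // language code, resolved to a display name via getLanguageName()
  style?: string     // writing style id, resolved via getStyleName()
  text: string       // the translated text
}

interface HistoryItem {
  source: string                    // the original input text
  translations: TranslationResult[] // one result per target language
  styles: string[]                  // writing styles requested for this entry
}
```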
Favorites
Clear
No favorite translations
{{ item.source }}
{{ item.translations.map(t => `${getLanguageName(t.language)}: ${t.text}`).join(' • ') }}
Styles: {{ item.styles.map(s => getStyleName(s)).join(', ') }}
Settings
×
API Provider
Select API Provider
Ollama (Local)
OpenAI
Choose between a local Ollama server and the OpenAI API
Ollama Server
Server URL
URL of your Ollama server
Model
Select a model
{{ model.name }}
Connect to the server to load available models
{{ availableModels.length }} models available
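The model dropdown is populated once the server is reachable. Listing models from Ollama uses the GET /api/tags endpoint; a minimal sketch, assuming the default local port:

```ts
// Fetch the models installed on an Ollama server.
// The response shape ({ models: [{ name }] }) matches Ollama's /api/tags endpoint.
async function fetchOllamaModels(serverUrl: string): Promise<{ name: string }[]> {
  const res = await fetch(`${serverUrl}/api/tags`)
  if (!res.ok) throw new Error(`Could not reach Ollama at ${serverUrl} (${res.status})`)
  const data = await res.json() as { models: { name: string }[] }
  return data.models
}

// Usage (assumed default Ollama address):
// const availableModels = await fetchOllamaModels('http://localhost:11434')
```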
OpenAI Configuration
API Key
Your OpenAI API key (stored locally). Get one at
platform.openai.com
Base URL
Base URL of the OpenAI API (or an OpenAI-compatible endpoint)
Model
Select a model
{{ model.name }} - {{ model.description }}
Test the connection to load the chat models available to your account
{{ openaiModels.length }} chat models available in your account
💡 Note:
The OpenAI API requires a paid account with available credits. Check your usage and billing at
platform.openai.com/usage
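For OpenAI, loading the account's models maps onto GET {baseUrl}/models with a Bearer token; the name filter below is an assumption about how the app narrows the list to chat models.

```ts
// List the model ids available to the account behind the given API key.
async function fetchOpenAIModels(baseUrl: string, apiKey: string): Promise<string[]> {
  const res = await fetch(`${baseUrl}/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  })
  if (!res.ok) throw new Error(`OpenAI request failed: ${res.status}`)

  const data = await res.json() as { data: { id: string }[] }
  return data.data
    .map(m => m.id)
    .filter(id => id.includes('gpt'))   // assumed heuristic for "chat models"
    .sort()
}

// Usage with the default endpoint:
// const openaiModels = await fetchOpenAIModels('https://api.openai.com/v1', apiKey)
```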
Connection Test
{{ testingConnection ? 'Testing...' : 'Test Connection' }}
{{ connectionTestResult.message }}
Test your API connection
Model Parameters
Context Size (num_ctx)
Context window size for the model, in tokens
Temperature
Controls randomness (0.0 = deterministic, 1.0 = creative)
Top P
Nucleus sampling threshold
Max Tokens
Maximum tokens in response
Advanced Options (JSON)
Additional Parameters
Additional Ollama parameters as a JSON object
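These parameters plus the advanced JSON presumably end up in the request's options object. A sketch, assuming the advanced JSON is parsed last so it can override the individual fields; the Ollama option names (num_ctx, temperature, top_p, num_predict) are real.

```ts
// Combine the slider/field values and the free-form JSON into one Ollama "options" object.
function buildOllamaOptions(settings: {
  numCtx: number
  temperature: number
  topP: number
  maxTokens: number
  advancedJson: string          // contents of the "Additional Parameters" field
}): Record<string, unknown> {
  let advanced: Record<string, unknown> = {}
  try {
    advanced = settings.advancedJson ? JSON.parse(settings.advancedJson) : {}
  } catch {
    console.warn('Advanced options are not valid JSON; ignoring them')
  }

  return {
    num_ctx: settings.numCtx,
    temperature: settings.temperature,
    top_p: settings.topP,
    num_predict: settings.maxTokens,  // Ollama's name for the response token limit
    ...advanced,                      // advanced JSON wins on conflicts (assumed)
  }
}
```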
Active Languages
Select languages to show in the language selector
{{ lang.name }} ({{ lang.code }})
Active Writing Styles
Select writing styles to show in the style selector
{{ style.name }}
Cancel
Save Settings
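Save Settings, together with the note that the API key is stored locally, suggests the configuration is persisted in the browser. A sketch assuming localStorage; the storage key and the Settings shape are assumptions.

```ts
interface Settings {
  apiProvider: 'ollama' | 'openai'
  serverUrl: string
  model: string
  apiKey: string
  baseUrl: string
  activeLanguages: string[]
  activeStyles: string[]
}

const STORAGE_KEY = 'ollama-translate-settings'   // assumed key

function saveSettings(settings: Settings): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(settings))
}

function loadSettings(): Settings | null {
  const raw = localStorage.getItem(STORAGE_KEY)
  return raw ? (JSON.parse(raw) as Settings) : null
}
```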