Quill lets you integrate AI writing assistance (Ollama, OpenAI, or Claude) to help you write or respond to emails in Mozilla Thunderbird.
Process your emails with AI directly in Thunderbird
Summarize • Translate • Correct • Reply • Chat
Features • Providers • Installation • Configuration • Usage • FAQ
| Action | Description |
|---|---|
| Summarize | Transform long emails into concise bullet points |
| Translate FR | Translate to French |
| Translate EN | Translate to British English |
| Correct FR | Fix spelling and grammar (French) |
| Correct EN | Fix spelling and grammar (English) |
| Classify | Analyze tone: politeness, warmth, formality |
| Rewrite Polite | Make text more polite |
| Rewrite Formal | Make text more formal |
| Reply | Generate a draft response |
| Custom Prompt | Execute your own instructions |
- 💬 Interactive Chat - Continue conversations with AI
- 📝 Direct Insert - Insert responses directly into your email
- 🔄 Regenerate - Get a new response with one click
- ⚙️ Custom Actions - Add your own prompts
Quill supports 3 AI providers - choose based on your needs:
| Provider | Cost | Privacy | Speed | Best For |
|---|---|---|---|---|
| 🟠 Anthropic (Claude) | Pay-per-use | Cloud | Fast | Best quality |
| 🟢 OpenAI (GPT) | Pay-per-use | Cloud | Fast | Large ecosystem |
| 🔵 Ollama (Local) | Free | 100% Local | Varies | Privacy-focused |
- Download `quill-1.2.0.xpi` from Releases
- In Thunderbird: Menu ☰ → Add-ons and Themes (or `Ctrl+Shift+A`)
- Click ⚙️ → Install Add-on From File...
- Select the `.xpi` file and click Add
```shell
git clone https://github.com/mikecastrodemaria/Quill.git
cd Quill/plugin
```

Then in Thunderbird: Menu ☰ → Add-ons → ⚙️ → Debug Add-ons → Load Temporary Add-on → select `manifest.json`
Best for: High-quality responses, complex tasks
- Create account at console.anthropic.com
- Go to API Keys → Create Key
- Copy your key (format: `sk-ant-api03-...`)
- In Quill settings:
- Provider: Anthropic (Claude)
- Paste your API key
- Recommended model: Claude Sonnet 4.5
Pricing: ~$3/million input tokens, ~$15/million output tokens
Best for: GPT ecosystem users, fast responses
- Create account at platform.openai.com
- Go to API Keys → Create new secret key
- Copy your key (format: `sk-proj-...`)
- In Quill settings:
- Provider: OpenAI (GPT)
- Paste your API key
- Recommended model: GPT-4o
Pricing: ~$2.50/million input tokens, ~$10/million output tokens
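To put the two providers' rates in perspective, here is a back-of-envelope cost calculation per email. The token counts are illustrative assumptions, not measurements; the rates are the ones quoted above.

```shell
# Rough cost of one email exchange at a given pricing.
# Arguments: input tokens, output tokens, $/M input tokens, $/M output tokens.
email_cost() {
  awk -v i="$1" -v o="$2" -v ri="$3" -v ro="$4" \
    'BEGIN { printf "%.6f\n", (i * ri + o * ro) / 1000000 }'
}

# A ~1,500-token email with a ~300-token generated reply:
email_cost 1500 300 3 15      # Claude Sonnet rates → 0.009000
email_cost 1500 300 2.50 10   # GPT-4o rates       → 0.006750
```

In other words, a typical email costs well under a cent with either cloud provider.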
Best for: Privacy, offline use, no API costs
🍎 macOS
```shell
# Download from https://ollama.ai or use Homebrew:
brew install ollama
```

🐧 Linux

```shell
curl -fsSL https://ollama.ai/install.sh | sh
```

🪟 Windows
Download installer from ollama.ai/download
```shell
# Recommended for email tasks:
ollama pull llama3

# Other good options:
ollama pull mistral   # Faster, lighter
ollama pull qwen2.5   # Good multilingual
ollama pull mixtral   # More powerful (needs 32GB+ RAM)
```

Thunderbird extensions require CORS headers. Configure Ollama:
🍎 macOS - Method 1: Manual Launch
```shell
# Quit Ollama app first (Menu bar → Quit)
OLLAMA_ORIGINS="*" ollama serve
```

Keep the terminal open while using Quill.
🍎 macOS - Method 2: Permanent (Recommended)
```shell
# Set environment variable
launchctl setenv OLLAMA_ORIGINS "*"
```

Then restart your Mac. After restarting, the Ollama app will work normally with CORS enabled.
Alternative - Create launch agent:
```shell
cat > ~/Library/LaunchAgents/com.ollama.env.plist << 'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.ollama.env</string>
    <key>ProgramArguments</key>
    <array>
        <string>sh</string>
        <string>-c</string>
        <string>launchctl setenv OLLAMA_ORIGINS "*"</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>
EOF
launchctl load ~/Library/LaunchAgents/com.ollama.env.plist
```

🐧 Linux - Systemd Service
```shell
# Edit the service file
sudo systemctl edit ollama

# Add these lines:
[Service]
Environment="OLLAMA_ORIGINS=*"

# Restart service
sudo systemctl restart ollama
```

Or launch manually:

```shell
OLLAMA_ORIGINS="*" ollama serve
```

🪟 Windows - Environment Variable
Option 1: PowerShell (Admin)
```powershell
[Environment]::SetEnvironmentVariable("OLLAMA_ORIGINS", "*", "User")
```

Then restart Ollama.
Option 2: GUI
- Search "Environment Variables" in Start Menu
- Click "Environment Variables..."
- Under "User variables", click New
- Name: `OLLAMA_ORIGINS`
- Value: `*`
- OK → Restart Ollama
```shell
# Should return your models list:
curl http://localhost:11434/api/tags

# Test with CORS header:
curl -H "Origin: moz-extension://test" http://localhost:11434/api/tags
```

- In Quill settings:
  - Provider: Ollama (Local)
  - URL: `http://localhost:11434` (default)
  - Select your downloaded model
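You can also sanity-check the endpoint by hand before pointing Quill at it. The sketch below builds a request body for Ollama's standard `/api/generate` REST endpoint and prints it; whether Quill uses `generate` or `chat` internally is an assumption, and the prompt is just an example.

```shell
# Build the JSON body for Ollama's /api/generate endpoint (standard
# Ollama REST API); this is a manual sanity check, not Quill internals.
MODEL="llama3"
PROMPT="Summarize: Meeting moved to 3pm Friday."
PAYLOAD=$(printf '{"model":"%s","prompt":"%s","stream":false}' "$MODEL" "$PROMPT")
echo "$PAYLOAD"

# With Ollama running, send it like this:
# curl -s http://localhost:11434/api/generate -d "$PAYLOAD"
```

If the curl call returns a JSON response instead of a CORS or connection error, Quill should be able to reach the same endpoint.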
- Open a compose window (New, Reply, or Forward)
- Select text to process (or leave empty for entire email)
- Click the Quill icon (toolbar, top right)
- Choose an action from the dropdown
- Wait for AI to process
- Insert result or Regenerate for a new response
- After getting a response, click "Convert to Chat"
- Continue the conversation with follow-up questions
- AI remembers context from your email
- Go to Quill Settings
- Scroll to Actions section
- Click Add an action
- Enter name and prompt
- Save
Example custom prompts:
Name: Simplify
Prompt: Rewrite this text using simpler words and shorter sentences.
Name: Extract Tasks
Prompt: Extract all action items and tasks from this email as a numbered list.
Name: Professional Tone
Prompt: Rewrite this maintaining the message but with a more professional business tone.
| Use Case | Anthropic | OpenAI | Ollama |
|---|---|---|---|
| Daily email | Claude 3.5 Haiku | GPT-4o-mini | llama3 / mistral |
| Complex tasks | Claude Sonnet 4.5 | GPT-4o | mixtral / qwen2.5 |
| Budget-conscious | Claude 3.5 Haiku | GPT-3.5-turbo | Any local model |
| Maximum quality | Claude 3 Opus | GPT-4 | llama3:70b |
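For the local-model column, available RAM is usually the deciding factor (recall the 32GB+ note for mixtral earlier). A minimal helper sketch; the thresholds are rough assumptions, not official requirements:

```shell
# Pick a local model from the table above based on available RAM (GB).
# Thresholds are rough assumptions, not official requirements.
suggest_model() {
  if [ "$1" -ge 32 ]; then
    echo "mixtral"   # more powerful, needs 32GB+ RAM
  elif [ "$1" -ge 16 ]; then
    echo "llama3"    # solid default for email tasks
  else
    echo "mistral"   # faster, lighter
  fi
}

suggest_model 8    # → mistral
suggest_model 64   # → mixtral
```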
The extension doesn't respond
- Check your API key is correct
- Verify you have credits (Anthropic/OpenAI)
- For Ollama: ensure service is running with CORS
- Check Thunderbird console: Menu → Tools → Developer Tools → Error Console
Ollama models don't appear in the list
CORS is not configured. See Ollama CORS Configuration.
Can I use Quill offline?
Yes, with Ollama! Local models work without internet. Anthropic and OpenAI require an internet connection.
Is my data secure?
- Ollama: 100% local, nothing leaves your computer
- Anthropic/OpenAI: Data sent to their servers for processing. Check their privacy policies.
- Quill itself: Collects no data. API keys stored locally in Thunderbird.
How much does it cost?

- Ollama: free, runs on your own hardware
- Anthropic: ~$3/million input tokens, ~$15/million output tokens
- OpenAI: ~$2.50/million input tokens, ~$10/million output tokens

At those rates, a typical email costs roughly a cent or less to process with a cloud provider.
| Error | Solution |
|---|---|
| `401 Unauthorized` | Check API key is correct |
| `429 Rate Limited` | Wait and retry, or upgrade plan |
| `500 Server Error` | Provider issue, try later |
| `NetworkError` | Check internet / Ollama CORS |
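The error table above can be condensed into a small triage helper when scripting checks around the providers. This is a hypothetical helper for illustration, not part of Quill:

```shell
# Map an HTTP status code to the suggested fix from the table above.
# (Hypothetical helper for illustration, not part of Quill.)
triage() {
  case "$1" in
    401) echo "Check API key is correct" ;;
    429) echo "Wait and retry, or upgrade plan" ;;
    500) echo "Provider issue, try later" ;;
    *)   echo "Check internet / Ollama CORS" ;;
  esac
}

triage 429   # → Wait and retry, or upgrade plan
```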
```shell
# Check if Ollama is running:
curl http://localhost:11434/api/tags

# Check CORS headers:
curl -v http://localhost:11434/api/tags 2>&1 | grep -i "access-control"

# Restart Ollama with CORS:
pkill ollama
OLLAMA_ORIGINS="*" ollama serve
```

Contributions welcome!
- Fork the repository
- Create a branch: `git checkout -b feature/improvement`
- Commit changes: `git commit -m 'Add feature'`
- Push: `git push origin feature/improvement`
- Open a Pull Request
GPL-3.0 - See LICENSE
Fork of Aify by Ali Raheem.
- Development: Supersonique Studio SARL
- Original Project: Aify by Ali Raheem
- AI Providers: Anthropic, OpenAI, Ollama
Made with ❤️ by Supersonique Studio SARL