An AI-powered VS Code extension that provides real-time grammar, spelling, and style checking for Markdown and plaintext files — like having a professional proofreader built into your editor.
Features
- 🔍 Real-time Background Scanning — Automatically checks your prose as you type with configurable debounce (default 1500ms)
- 🌊 Wavy Underline Diagnostics — Issues are highlighted directly in the editor with severity-colored wave underlines
- ⚡ Quick Fixes — Click on any highlighted issue to see the suggested replacement and apply it with one click
- 📊 Diff Preview — Before applying a fix, see a clear side-by-side comparison of the original text and the suggested correction
- 🔌 Multi-Provider Support — Works with OpenAI (GPT-4o), Anthropic (Claude 3.5 Sonnet), and local models via Ollama
- 🔒 Secure API Key Storage — API keys are stored in VS Code's encrypted SecretStorage, never in config files
- 📄 Full Document Scanning — Large documents are split into paragraph-based chunks, each scanned sequentially to cover the entire file
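The paragraph-based chunking described above can be sketched roughly as follows (a minimal illustration; the function name and exact merging rules are assumptions, not the extension's actual implementation):

```typescript
// Split a document into chunks of at most maxChars characters,
// preferring paragraph boundaries (blank lines) as split points.
export function chunkByParagraphs(text: string, maxChars: number): string[] {
  const paragraphs = text.split(/\n{2,}/);
  const chunks: string[] = [];
  let current = "";
  for (const para of paragraphs) {
    // Start a new chunk when appending this paragraph would overflow.
    // (A single paragraph longer than maxChars still becomes one chunk.)
    if (current && current.length + para.length + 2 > maxChars) {
      chunks.push(current);
      current = para;
    } else {
      current = current ? current + "\n\n" + para : para;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```

Each chunk is then scanned sequentially, so the whole file is covered without exceeding the per-request limit.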
Demo
Wavy underlines highlight grammar and style issues directly in the editor
Hover over a wavy underline to see detailed error information
Click on an issue to see suggested quick fixes
Getting Started
1. Set Your API Key
Open the Command Palette (Cmd+Shift+P on macOS, Ctrl+Shift+P on Windows/Linux) and run: QuillAI: Set API Key
2. Configure Settings
Open VS Code Settings (Cmd+,) and search for "QuillAI":

| Setting | Default | Description |
| --- | --- | --- |
| `quillai.provider` | `openai` | LLM provider: `openai`, `anthropic`, or `ollama` |
| `quillai.model` | `gpt-4o` | Model name (e.g. `gpt-4o`, `claude-3-5-sonnet-20241022`, `llama3`) |
| `quillai.endpoint` | (auto) | Custom API endpoint URL |
| `quillai.debounceMs` | `1500` | Delay in ms before scanning after typing stops |
| `quillai.maxChars` | `5000` | Max characters per scan (larger docs use paragraph extraction) |
| `quillai.enabled` | `true` | Enable/disable automatic background scanning |
| `quillai.diagnosticSeverity` | `warning` | Default severity for issues |
| `quillai.language` | `auto` | Proofreading language/locale (e.g. `en-US`, `en-GB`, `zh-CN`, `zh-HK`, `zh-MO`, `zh-SG`) or `auto` |
| `quillai.systemPrompt` | (built-in) | Custom system prompt for the LLM |
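For example, a typical `settings.json` using Anthropic might look like this (values are illustrative):

```json
{
  "quillai.provider": "anthropic",
  "quillai.model": "claude-3-5-sonnet-20241022",
  "quillai.debounceMs": 2000,
  "quillai.language": "en-US"
}
```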
3. Start Writing
Open any `.md`, `.txt`, or `.tex` file. The extension will automatically scan your text and highlight issues. For large documents, it splits the text into paragraph-based chunks and scans each one to cover the entire file.
Commands
| Command | Description |
| --- | --- |
| `QuillAI: Check Current Document` | Manually trigger a full document scan |
| `QuillAI: Set API Key` | Set or update your API key securely |
| `QuillAI: Set Proofreading Language` | Choose a specific language/locale (or auto-detect) |
| `QuillAI: Clear All Diagnostics` | Remove all highlighted issues |
Using with Ollama (Local Models)
- Install and start Ollama
- Pull a model: `ollama pull llama3`
- Set the provider to `ollama` in settings
- Set the model to `llama3` (or your preferred model)
- An API key is not required for local models
Requirements
- VS Code 1.118.0 or higher
- An API key for OpenAI or Anthropic (unless using Ollama locally)
Privacy & Security
- API keys are stored exclusively in VS Code's `SecretStorage` (encrypted)
- API keys are never written to configuration files or logged
- Text is sent to the configured LLM provider for analysis only
- No telemetry or data collection by this extension
Enjoy!