EverRun, by hmadhsan
EverRun ⚡

A VS Code extension built for one goal: Never let your coding flow stop.


Features

| Feature | Description |
| --- | --- |
| Prompt Compression | Automatically trims and compresses conversation history before sending to Claude |
| Smart Context Injection | Detects relevant open files and injects only the useful parts |
| Conversation Summarisation | Collapses old messages into dense summaries instead of dropping them |
| Streaming Responses | Typewriter-effect streaming for immediate feedback |
| Save & Resume Conversations | Persist any conversation to disk and reload it later |
| Task Notes | Attach free-text notes to a session so you remember where you left off |
| Settings Panel | Full configuration UI inside the extension |

Quick Start

1. Install

# From VSIX (after packaging):
code --install-extension claude-token-optimizer-0.1.0.vsix

Or press F5 in VS Code with this folder open to launch the Extension Development Host.

2. Set your API Key

  • Open the Command Palette → EverRun: EverRun Settings
  • Or click the ⚡ Claude status bar item → gear icon in the panel

You need an Anthropic API key.
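If you prefer to configure the key by hand, it lives under the `claudeTokenOptimizer` namespace documented in the Configuration section below. A minimal `settings.json` entry would look like:

```jsonc
// settings.json (User or Workspace)
// The key name comes from the extension's documented settings namespace.
{
  "claudeTokenOptimizer.apiKey": "sk-ant-..."
}
```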

3. Send your first prompt

  • Select code in any editor
  • Press Ctrl+Shift+C (Cmd+Shift+C on macOS), or right-click → Optimize Prompt
  • The Chat Panel opens with your selection pre-filled
  • Type your question and hit Send

Commands

| Command | Keybinding | Description |
| --- | --- | --- |
| EverRun: Optimize Prompt | Ctrl+Shift+C | Open the chat panel (pre-fills the editor selection) |
| EverRun: Open EverRun Chat Panel | — | Open the chat panel directly |
| EverRun: Save Conversation | — | Save the current history to disk |
| EverRun: Load Conversation | — | Pick and restore a saved conversation |
| EverRun: Clear Conversation History | — | Wipe the in-memory history |
| EverRun: EverRun Settings | — | Open the VS Code settings page |
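The default shortcut can be rebound in `keybindings.json` like any VS Code command. The command ID below is illustrative only (this page does not list the extension's internal IDs); check the extension's `package.json` under `contributes.commands` for the real one:

```jsonc
// keybindings.json — rebinding Optimize Prompt to Ctrl+Alt+O.
// "everrun.optimizePrompt" is a hypothetical ID, not confirmed by this page.
{
  "key": "ctrl+alt+o",
  "command": "everrun.optimizePrompt",
  "when": "editorHasSelection"
}
```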

Configuration

All settings are under the claudeTokenOptimizer namespace:

| Setting | Default | Description |
| --- | --- | --- |
| `apiKey` | `""` | Your Anthropic API key |
| `model` | `claude-3-5-sonnet-20241022` | Model to use |
| `maxTokens` | `4096` | Max response tokens |
| `maxHistoryMessages` | `10` | History window size |
| `enableContextCompression` | `true` | Toggle compression |
| `enableFileContext` | `true` | Attach open files |
| `summarizeOldMessages` | `true` | Summarise rather than drop old messages |
| `maxContextFileLines` | `100` | Lines per context file |
| `systemPrompt` | (see default) | System prompt sent with every request |
| `saveConversationsPath` | `""` | Override the conversations save directory |
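Putting a few of these together, a typical `settings.json` block (values here are simply the documented defaults) might be:

```jsonc
// settings.json — example EverRun configuration using the
// documented claudeTokenOptimizer.* keys
{
  "claudeTokenOptimizer.model": "claude-3-5-sonnet-20241022",
  "claudeTokenOptimizer.maxTokens": 4096,
  "claudeTokenOptimizer.maxHistoryMessages": 10,
  "claudeTokenOptimizer.enableContextCompression": true,
  "claudeTokenOptimizer.summarizeOldMessages": true,
  "claudeTokenOptimizer.maxContextFileLines": 100
}
```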

How Compression Works

User Prompt + Conversation History
         │
         ▼
  ┌─────────────────────────────┐
  │  1. Trim history window     │  → keep only the N newest messages
  │  2. Summarise older half    │  → ask Claude to condense, not drop
  │  3. Attach relevant files   │  → keyword-scored, capped at N lines
  │  4. Build final message     │  → prompt + code_context block
  └─────────────────────────────┘
         │
         ▼
    Claude API (streaming)
         │
         ▼
    Typewriter response in VS Code panel

Token savings are shown in the stats bar after every response.
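The four pipeline steps above can be sketched in TypeScript. This is an illustrative sketch, not the extension's actual code: the function names, message shapes, and keyword-scoring heuristic are assumptions, and the summarisation step is stubbed out where the real extension would call Claude.

```typescript
// Sketch of the compression pipeline described above (illustrative only).

interface Message { role: "user" | "assistant"; content: string; }

// Step 1: keep only the N newest messages; return the older remainder
// so it can be summarised instead of silently dropped.
function trimHistory(history: Message[], maxMessages: number): { kept: Message[]; older: Message[] } {
  const cut = Math.max(0, history.length - maxMessages);
  return { kept: history.slice(cut), older: history.slice(0, cut) };
}

// Step 2 (stub): the real extension would ask Claude to condense the
// older messages; here we just concatenate truncated snippets.
function summarize(older: Message[]): string {
  return older.map(m => `${m.role}: ${m.content.slice(0, 40)}`).join("; ");
}

// Step 3: score open files by keyword overlap with the prompt and cap
// each attached file at maxLines lines (a plausible scoring scheme).
function attachRelevantFiles(prompt: string, files: Record<string, string>, maxLines: number): string[] {
  const keywords = prompt.toLowerCase().split(/\W+/).filter(w => w.length > 3);
  return Object.entries(files)
    .map(([name, text]) => ({
      name,
      text,
      score: keywords.filter(k => text.toLowerCase().includes(k)).length,
    }))
    .filter(f => f.score > 0)
    .sort((a, b) => b.score - a.score)
    .map(f => `// ${f.name}\n` + f.text.split("\n").slice(0, maxLines).join("\n"));
}

// Step 4: assemble the final message: summary + code_context block + prompt.
function buildMessage(prompt: string, summary: string, context: string[]): string {
  const parts: string[] = [];
  if (summary) parts.push(`<summary>${summary}</summary>`);
  if (context.length) parts.push(`<code_context>\n${context.join("\n\n")}\n</code_context>`);
  parts.push(prompt);
  return parts.join("\n\n");
}
```

The key design point is step 2: trimming alone loses information, so the trimmed-off half is summarised and carried forward as a single compact message.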


Development

# Install dependencies
npm install

# Compile TypeScript
npm run compile

# Watch mode (recompiles on save)
npm run watch

# Run in development (press F5 in VS Code)
# Package as VSIX
npm run package

Project Structure

src/
├── extension.ts                 # Entry point, command registration
├── commands/
│   └── optimizePrompt.ts        # "Optimize Prompt" command logic
├── services/
│   ├── claudeService.ts         # Anthropic SDK wrapper + token estimation
│   ├── contextCompressor.ts     # Compression pipeline
│   └── conversationManager.ts  # Save / load / history management
├── panels/
│   └── chatPanel.ts             # WebView panel (HTML + message bridge)
└── config/
    └── settings.ts              # Typed settings accessors

Privacy

  • Your API key is stored in VS Code's global settings and is transmitted only to Anthropic.
  • Conversations saved to disk stay local; no telemetry is collected.

License

MIT
