Neurometric VS Code Extension

Optimize your LLM usage with cost analysis, model recommendations, and side-by-side comparisons directly in VS Code.

Features

Cost Analysis

  • Inline CodeLens: See cost estimates directly above LLM API calls
  • Analyze Current File: Get a complete cost breakdown for all detected API calls
  • Token Counting: Automatic token estimation for prompts
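
The kind of estimate the CodeLens surfaces can be sketched as follows. This is a hypothetical illustration, not the extension's actual implementation: tokens are approximated at roughly 4 characters per token (a common rule of thumb), and the per-million-token rates are illustrative, not real pricing.

```python
PRICING_USD_PER_MTOK = {  # illustrative rates, not actual provider pricing
    "gpt-4o": {"input": 2.50, "output": 10.00},
    "claude-3-5-sonnet": {"input": 3.00, "output": 15.00},
}

def estimate_tokens(text: str) -> int:
    """Rough token count: about 4 characters per token."""
    return max(1, len(text) // 4)

def estimate_cost(prompt: str, model: str, expected_output_tokens: int = 500) -> float:
    """Approximate USD cost of one call: input tokens plus an assumed output budget."""
    rates = PRICING_USD_PER_MTOK[model]
    input_cost = estimate_tokens(prompt) * rates["input"] / 1_000_000
    output_cost = expected_output_tokens * rates["output"] / 1_000_000
    return input_cost + output_cost

cost = estimate_cost("Summarize this document." * 100, "gpt-4o")
print(f"~${cost:.4f} per call")  # prints "~$0.0065 per call"
```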

Model Recommendations

  • Task-Based Recommendations: Get model suggestions based on your task type
  • Constraint-Aware: Recommendations consider your budget and latency requirements
  • Alternative Options: See ranked alternatives with tradeoff explanations
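
Constraint-aware ranking can be pictured as a filter-then-sort over a model catalog. This is a minimal sketch under assumed data: the model names, costs, latencies, and quality scores below are illustrative, not Neurometric's actual catalog or scoring.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    cost_per_call_usd: float  # illustrative estimate
    p50_latency_ms: int       # illustrative
    quality: float            # 0..1 task-specific score (assumed)

CATALOG = [
    Candidate("gpt-4o", 0.0075, 900, 0.95),
    Candidate("gpt-4o-mini", 0.0004, 400, 0.82),
    Candidate("claude-3-5-sonnet", 0.0090, 1100, 0.96),
]

def recommend(max_cost_usd: float, max_latency_ms: int) -> list[Candidate]:
    """Drop models that violate the budget or latency constraint, then rank by quality."""
    feasible = [c for c in CATALOG
                if c.cost_per_call_usd <= max_cost_usd
                and c.p50_latency_ms <= max_latency_ms]
    return sorted(feasible, key=lambda c: c.quality, reverse=True)

ranked = recommend(max_cost_usd=0.008, max_latency_ms=1000)
```

The infeasible models are not discarded silently; ranked alternatives with tradeoff explanations (as the feature list describes) would come from the same feasible set.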

Arena Mode

  • Side-by-Side Comparison: Compare multiple models on the same prompt
  • Real-Time Metrics: See cost, latency, and token usage for each response
  • Easy Integration: Copy responses or insert directly at cursor
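
The per-model metrics Arena Mode reports can be sketched like this. `run_model` is a hypothetical stand-in for a real provider call, and the token estimate is a rough character-count heuristic.

```python
import time

def run_model(model: str, prompt: str) -> str:
    # Stand-in for a real API call; returns a canned response.
    return f"[{model}] response to: {prompt[:20]}"

def arena(models: list[str], prompt: str) -> list[dict]:
    """Run the same prompt against each model and collect latency and token metrics."""
    rows = []
    for model in models:
        start = time.perf_counter()
        output = run_model(model, prompt)
        latency_ms = (time.perf_counter() - start) * 1000
        rows.append({
            "model": model,
            "latency_ms": round(latency_ms, 2),
            "output_tokens": max(1, len(output) // 4),  # rough estimate
        })
    return rows

results = arena(["gpt-4o", "claude-3-5-sonnet"], "Explain quicksort briefly.")
```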

MCP Server Integration

The extension includes an MCP (Model Context Protocol) server that enables AI assistants to:

  • Analyze costs for prompts
  • Get model recommendations
  • Compare models programmatically
  • Access pricing information

Installation

From VS Code Marketplace

  1. Open VS Code
  2. Go to Extensions (Ctrl+Shift+X / Cmd+Shift+X)
  3. Search for "Neurometric"
  4. Click Install

From Source

cd packages/vscode-extension
npm install
npm run compile

Then press F5 to launch the Extension Development Host.

Usage

Keyboard Shortcuts

  • Ctrl+Shift+N / Cmd+Shift+N: Analyze costs in current file
  • Ctrl+Shift+A / Cmd+Shift+A: Open Arena Mode
  • Ctrl+Shift+R / Cmd+Shift+R: Get model recommendation

Commands

Access via Command Palette (Ctrl+Shift+P / Cmd+Shift+P):

  • Neurometric: Analyze Cost
  • Neurometric: Open Arena Mode
  • Neurometric: Get Model Recommendation
  • Neurometric: Sign In

Context Menu

Right-click in the editor to:

  • Analyze Cost at Cursor
  • Compare Selection with Multiple Models

Configuration

| Setting | Default | Description |
|---|---|---|
| neurometric.showCostInline | true | Show cost estimates via CodeLens |
| neurometric.defaultModels | ["gpt-4o", "claude-3-5-sonnet", "gemini-1.5-pro"] | Models for Arena comparisons |
| neurometric.autoAnalyze | false | Auto-analyze on file save |
| neurometric.costThreshold | 0.01 | Highlight costs above this amount (USD) |
| neurometric.apiEndpoint | https://api.neurometric.ai/v1 | API endpoint (enterprise/self-hosted) |
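
For example, a settings.json entry that keeps the defaults but narrows Arena comparisons to two models (the values shown otherwise mirror the defaults above):

```json
{
  "neurometric.showCostInline": true,
  "neurometric.defaultModels": ["gpt-4o", "claude-3-5-sonnet"],
  "neurometric.autoAnalyze": false,
  "neurometric.costThreshold": 0.01,
  "neurometric.apiEndpoint": "https://api.neurometric.ai/v1"
}
```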

Supported Languages

The extension detects LLM API calls in:

  • Python
  • TypeScript
  • JavaScript
  • TypeScript React (TSX)
  • JavaScript React (JSX)

Detected Providers

  • OpenAI (GPT-4, GPT-4o, etc.)
  • Anthropic (Claude 3, Claude 3.5)
  • Google (Gemini 1.5)
  • Azure OpenAI
  • AWS Bedrock
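
One plausible way to detect such calls is to pattern-match well-known SDK call sites in source text. The patterns below are illustrative; the extension's actual detection logic may differ.

```python
import re

# Illustrative SDK call-site patterns, keyed by provider.
PROVIDER_PATTERNS = {
    "openai": re.compile(r"\bchat\.completions\.create\s*\("),
    "anthropic": re.compile(r"\bmessages\.create\s*\("),
    "google": re.compile(r"\bgenerate_content\s*\("),
}

def detect_providers(source: str) -> set[str]:
    """Return the set of providers whose call patterns appear in the source."""
    return {name for name, pat in PROVIDER_PATTERNS.items() if pat.search(source)}

snippet = "resp = client.chat.completions.create(model='gpt-4o', messages=msgs)"
```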

MCP Server

The extension registers an MCP server that AI assistants can use. Available tools:

neurometric_analyze_cost

Analyze cost and token usage for a prompt.

neurometric_recommend_model

Get model recommendations based on task type.

neurometric_compare_models

Compare multiple models side-by-side.

neurometric_get_pricing

Get current pricing information.
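
An MCP client invokes tools via a JSON-RPC "tools/call" request. The sketch below builds such a request for neurometric_analyze_cost; the tool name comes from the list above, but the argument names ("prompt", "model") are assumptions, not a documented schema.

```python
import json

# Hypothetical tools/call request for the neurometric_analyze_cost tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "neurometric_analyze_cost",
        # Argument names are assumed for illustration.
        "arguments": {"prompt": "Summarize this file.", "model": "gpt-4o"},
    },
}
payload = json.dumps(request)
```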

Development

# Install dependencies
npm install

# Compile TypeScript
npm run compile

# Watch mode
npm run watch

# Run linting
npm run lint

# Package extension
npm run package

License

MIT

Links

  • Neurometric Website
  • Documentation
  • GitHub Repository