LLM Token Counter

Shows in the status bar the LLM (Large Language Model) token counts for your entire codebase, the current file and any selected text.

(Screenshot: token counts shown in the status bar)

Features

This extension shows the token counts for:

  • Codebase: Click on the codebase token counter in the status bar to analyze your entire project.
  • Current File: Automatically shown in the status bar for the current file.
  • Selected Text: Select text to see its token count.

Supported Tokenizers

The extension supports two popular tokenizers:

  • OpenAI: Used for models such as GPT-3.5 and GPT-4.
  • Anthropic: Used for Claude models.

You can toggle between these tokenizers using the command palette.
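If you prefer a keyboard shortcut over the Command Palette, you can bind the toggle to a key in keybindings.json. The sketch below is illustrative only: the command identifier llmTokenCounter.toggleTokenizer and the key chord are assumptions, so check the extension's contributed commands in the Command Palette for the actual identifier.

    [
      {
        // Hypothetical command ID; verify it against the extension's Command Palette entry.
        "key": "ctrl+alt+t",
        "command": "llmTokenCounter.toggleTokenizer"
      }
    ]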

Extension Settings

This extension has the following settings:

  • llmTokenCounter.defaultTokenizer: Set the default tokenizer (OpenAI or Anthropic).
  • llmTokenCounter.showInStatusBar: Enable or disable showing token counts in the status bar.
  • llmTokenCounter.includePattern: Glob pattern for files to include when counting codebase tokens (default: **/*.{js,ts,jsx,tsx,py,java,c,cpp,h,hpp,cs,go,rs,php,rb,md,txt,json,yaml,yml,html,css,scss,less}).
  • llmTokenCounter.excludePattern: Glob pattern for files/folders to exclude from codebase token counting (default: **/node_modules/**).

You can configure these settings in your VS Code settings.json or through the extension settings UI.
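For example, a settings.json entry could look like the sketch below. The setting names come from the list above; the specific values (tokenizer name, glob patterns) are illustrative and can be adjusted to your project.

    {
      // Tokenizer used for counting: OpenAI or Anthropic (exact value casing assumed).
      "llmTokenCounter.defaultTokenizer": "OpenAI",
      // Show the token counters in the status bar.
      "llmTokenCounter.showInStatusBar": true,
      // Narrow codebase counting to selected source and docs files (example pattern).
      "llmTokenCounter.includePattern": "**/*.{ts,tsx,py,md}",
      // Skip dependency and build output folders (example pattern).
      "llmTokenCounter.excludePattern": "**/{node_modules,dist,out}/**"
    }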
