Displays LLM (Large Language Model) token counts in the status bar for your entire codebase, the current file, and any selected text.
## Features
This extension shows the token counts for:
- **Codebase**: Click the codebase token counter in the status bar to analyze your entire project.
- **Current File**: Shown automatically in the status bar for the file you are editing.
- **Selected Text**: Select any text to see its token count.
## Supported Tokenizers
The extension supports two popular tokenizers:
- **OpenAI**: Used for models such as GPT-3.5 and GPT-4.
- **Anthropic**: Used for Claude models.
You can toggle between these tokenizers using the Command Palette.
## Extension Settings
This extension has the following settings:
- `llmTokenCounter.defaultTokenizer`: Set the default tokenizer (OpenAI or Anthropic).
- `llmTokenCounter.showInStatusBar`: Enable or disable showing token counts in the status bar.
- `llmTokenCounter.includePattern`: Glob pattern for files to include when counting codebase tokens (default: `**/*.{js,ts,jsx,tsx,py,java,c,cpp,h,hpp,cs,go,rs,php,rb,md,txt,json,yaml,yml,html,css,scss,less}`).
- `llmTokenCounter.excludePattern`: Glob pattern for files/folders to exclude from codebase token counting (default: `**/node_modules/**`).
You can configure these settings in your VS Code `settings.json` or through the extension's Settings UI.
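As a rough sketch, a `settings.json` entry might look like the following. The glob patterns shown are illustrative rather than the defaults, and the exact string values accepted by `defaultTokenizer` (including their casing) are assumptions; check the Settings UI for the values your installed version accepts.

```jsonc
{
  // Tokenizer used when no other choice is made; the exact value casing is an assumption.
  "llmTokenCounter.defaultTokenizer": "OpenAI",

  // Show token counts in the status bar.
  "llmTokenCounter.showInStatusBar": true,

  // Illustrative: only count a subset of source and documentation files.
  "llmTokenCounter.includePattern": "**/*.{js,ts,jsx,tsx,py,md,json}",

  // Illustrative: skip dependencies and build output in addition to the default node_modules.
  "llmTokenCounter.excludePattern": "**/{node_modules,dist}/**"
}
```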