MD Chat

AlexYuan
Chat with AI directly in Markdown files. The .md file IS the chat history.
Installation
Launch VS Code Quick Open (Ctrl+P), paste the following command, and press Enter.

MD Chat

Your .md file IS the chat history.

Chat with AI directly inside Markdown files. No side panels, no separate apps — just write ## User, hit Ctrl+Enter, and the response streams right back into your document.

Why MD Chat?

  • File-first — prompts, responses, notes, and code all live in one .md file. Easy to diff, search, share, and version control.
  • 6 providers — Claude CLI, Codex CLI, OpenAI, Anthropic API, OpenRouter, Ollama. Use cloud APIs or run local models.
  • Streaming — AI replies stream into the document in real time. Cancel anytime with Escape.
  • Session continuity — CLI providers auto-save session IDs in frontmatter. Pick up where you left off.
  • @[read] context — inline file contents into your prompt with @[read](src/main.ts). The AI sees the file, your .md stays clean.
  • Extract & Fork — pull code blocks into real files, or branch a conversation into a new chat.
  • Multi-file chats — run separate conversations in different files at the same time.
  • Per-file overrides — provider, model, system prompt, temperature — all configurable per file via frontmatter.

Quick Start

Option A: Use a CLI provider (no API key needed)

  1. Install Claude Code or Codex CLI
  2. Open any .md file and write:

## User
What is the capital of France?

  3. Press Ctrl+Enter — the response streams in as ## Assistant

Option B: Use an API provider

  1. Open VS Code Settings and search for mdChat
  2. Set your API key (e.g. mdChat.providers.openai.apiKey)
  3. Set mdChat.defaultProvider to openai (or anthropic, openrouter, ollama)
  4. Write ## User, press Ctrl+Enter
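The steps above correspond to a few settings.json entries. A minimal sketch, with a placeholder API key:

```jsonc
// settings.json (User or Workspace) — the key below is a placeholder
{
  "mdChat.defaultProvider": "openai",
  "mdChat.providers.openai.apiKey": "sk-...",
  "mdChat.providers.openai.defaultModel": "gpt-4o-mini"
}
```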

That's it. Keep adding ## User blocks to continue the conversation.
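A short exchange then looks something like this (the assistant's reply below is illustrative):

```markdown
## User
What is the capital of France?

## Assistant
The capital of France is Paris.

## User
And what about Italy?
```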

Providers

| Provider   | Type | Auth                | Session                 |
| ---------- | ---- | ------------------- | ----------------------- |
| claudeCli  | CLI  | Claude Code on PATH | Stateful (auto-resumed) |
| codexCli   | CLI  | Codex CLI on PATH   | Stateful (auto-resumed) |
| openai     | API  | API key             | Stateless               |
| anthropic  | API  | API key             | Stateless               |
| openrouter | API  | API key             | Stateless               |
| ollama     | API  | None (local)        | Stateless               |

Fallback chain — if the default provider is unavailable, MD Chat automatically tries the next one in mdChat.fallbackProviders.
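For example, to fall back to Codex CLI and then a local Ollama model when Claude CLI is unavailable (a sketch using the settings documented in the Settings section):

```jsonc
{
  "mdChat.defaultProvider": "claudeCli",
  "mdChat.fallbackProviders": ["codexCli", "ollama"]
}
```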

Per-file override — set provider: openai in frontmatter to use a different provider for a specific file.

Frontmatter

Control behavior per file with YAML frontmatter:

---
provider: openai
model: gpt-4o
temperature: 0.7
systemPrompt: You are a concise technical writer.
---

## User
Explain this architecture diagram.

![diagram](images/arch.png)
| Field           | Description                                                                               |
| --------------- | ----------------------------------------------------------------------------------------- |
| provider        | Override the default provider (claudeCli, codexCli, openai, anthropic, openrouter, ollama) |
| model           | Model to use (e.g. sonnet, gpt-4o, claude-sonnet-4-6, qwen2.5:7b)                         |
| temperature     | Sampling temperature                                                                      |
| maxTokens       | Max response tokens                                                                       |
| systemPrompt    | System prompt prepended to the conversation                                               |
| session         | Session ID (auto-managed, CLI providers only)                                             |
| sessionProvider | Provider that owns the session (auto-managed)                                             |
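After a turn with a CLI provider, the auto-managed fields sit alongside any manual ones. A sketch with a made-up session ID (the real ID format depends on the CLI):

```markdown
---
provider: claudeCli
model: sonnet
session: 0f8c2d1a-example-session-id
sessionProvider: claudeCli
---
```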

@[read] — Inline File Context

Reference source files directly in your prompt. The file content is sent to the AI, but your .md file stays clean:

## User
Please review this file:
@[read](src/utils/config.ts)

Focus on lines 10-25:
@[read](src/utils/config.ts#L10-L25)

The AI receives the actual file contents wrapped in a code fence with language detection. Your .md file keeps the @[read] syntax as-is.

Syntax:

  • @[read](path) — read the entire file
  • @[read](path#L10-L25) — read lines 10 through 25
  • @[read](path#L42) — read a single line
  • Paths are relative to the .md file's directory
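For a whole-file read, the provider receives something like the following sketch (the exact wrapper text is an assumption, and the file contents here are placeholders; the language tag is detected from the extension):

````markdown
Please review this file:

```ts
// placeholder contents of src/utils/config.ts
export const DEFAULT_TIMEOUT = 5000;
```
````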

Commands

| Command            | Shortcut     | Description                              |
| ------------------ | ------------ | ---------------------------------------- |
| Send to AI         | Ctrl+Enter   | Send the conversation to the AI provider |
| Cancel Generation  | Escape       | Stop the streaming response              |
| Extract Code Block | Ctrl+Shift+E | Save the code block at cursor to a file  |
| Fork Chat          | Ctrl+Shift+F | Branch the conversation into a new file  |
| New Chat File      | —            | Create a new .md chat from template      |
| Select Provider    | —            | Switch between providers                 |
| Select Model       | —            | Set model for the current file           |
| Reset Session      | —            | Clear session to start a fresh conversation |

Extract Code Block

When the AI gives you a code block, press Ctrl+Shift+E to save it to a file. The file extension is auto-detected from the language label (e.g. ```typescript → .ts).

Fork Chat

Press Ctrl+Shift+F to branch the conversation at the current position. Creates a new file with the history up to that point, clears the session, and adds an empty ## User section — ready for a different direction.
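For instance, forking after a single exchange might produce a file like this sketch (the conversation shown is illustrative; no session fields are carried over, since the session is cleared):

```markdown
---
provider: claudeCli
---

## User
What is the capital of France?

## Assistant
The capital of France is Paris.

## User

```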

Settings

General

| Setting                     | Default       | Description                                   |
| --------------------------- | ------------- | --------------------------------------------- |
| mdChat.defaultProvider      | claudeCli     | Default provider                              |
| mdChat.fallbackProviders    | ["codexCli"]  | Fallback provider chain                       |
| mdChat.showMetadata         | true          | Show provider/model/time after each response  |
| mdChat.showThinking         | true          | Show thinking/reasoning trace blocks when the provider emits them |
| mdChat.showToolUse          | true          | Show tool-use trace blocks when the provider emits them |
| mdChat.streamUpdateInterval | 80            | Streaming flush interval (ms)                 |

CLI Providers

| Setting                                  | Default | Description          |
| ---------------------------------------- | ------- | -------------------- |
| mdChat.providers.claudeCli.command       | claude  | Path to Claude CLI   |
| mdChat.providers.claudeCli.defaultModel  | sonnet  | Default Claude model |
| mdChat.providers.codexCli.command        | codex   | Path to Codex CLI    |
| mdChat.providers.codexCli.defaultModel   | —       | Default Codex model  |

API Providers

| Setting                                   | Default                       | Description              |
| ----------------------------------------- | ----------------------------- | ------------------------ |
| mdChat.providers.openai.apiKey            | —                             | OpenAI API key           |
| mdChat.providers.openai.baseUrl           | https://api.openai.com/v1     | OpenAI endpoint          |
| mdChat.providers.openai.defaultModel      | gpt-4o-mini                   | Default OpenAI model     |
| mdChat.providers.anthropic.apiKey         | —                             | Anthropic API key        |
| mdChat.providers.anthropic.baseUrl        | https://api.anthropic.com/v1  | Anthropic endpoint       |
| mdChat.providers.anthropic.defaultModel   | claude-sonnet-4-6             | Default Anthropic model  |
| mdChat.providers.openrouter.apiKey        | —                             | OpenRouter API key       |
| mdChat.providers.openrouter.baseUrl       | https://openrouter.ai/api/v1  | OpenRouter endpoint      |
| mdChat.providers.openrouter.defaultModel  | openai/gpt-4o-mini            | Default OpenRouter model |
| mdChat.providers.ollama.baseUrl           | http://127.0.0.1:11434/v1     | Ollama endpoint          |
| mdChat.providers.ollama.defaultModel      | qwen2.5:7b                    | Default Ollama model     |
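Taken together, a fully local setup needs no API key at all. A sketch using the Ollama defaults listed above:

```jsonc
{
  "mdChat.defaultProvider": "ollama",
  "mdChat.providers.ollama.baseUrl": "http://127.0.0.1:11434/v1",
  "mdChat.providers.ollama.defaultModel": "qwen2.5:7b"
}
```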

Requirements

  • VS Code 1.85+
  • At least one of:
    • Claude Code (claude on PATH)
    • Codex CLI (codex on PATH)
    • An API key for OpenAI, Anthropic, or OpenRouter
    • Ollama running locally

License

MIT
