DeerPrompt
deerflow | Free
Developing prompts in VSCode.

DeerPrompt

Demo

DeerPrompt is an AI context engineering development assistant for VS Code.

Features

🤖 AI Chat Assistant

  • Intelligent dialogue system based on large language models
  • Support for multiple mainstream AI model integrations
  • Real-time response with smooth conversation experience

🛠️ Development Tool Integration

  • Prompt file debugging in your project: Edit and debug prompt files directly within your projects, seamlessly integrated into your development workflow
  • Multi-turn conversation debugging: Complete conversation history management for debugging and optimizing multi-turn conversations
  • Custom model integration: Flexible configuration of AI model parameters, with support for custom model integrations and configuration management

🔧 Advanced Feature Support

  • Tools Support: A rich built-in toolset with function calling and external API integration
  • MCP (Model Context Protocol) Support: Standardized model context protocol for compatibility with a wide range of AI services
  • Multi-modal Support: Send and receive image messages for richer multi-modal interaction
  • Token Calculation: Real-time display of token usage to help optimize conversation costs
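Token usage figures like these typically come from the usage field of an OpenAI-compatible chat completions response. A minimal sketch of reading that field (field names follow the OpenAI response schema; the helper name is illustrative, not DeerPrompt's internals):

```python
import json

def read_token_usage(response_body: str) -> dict:
    """Extract token counts from an OpenAI-compatible chat completions response."""
    data = json.loads(response_body)
    usage = data.get("usage", {})
    return {
        "prompt_tokens": usage.get("prompt_tokens", 0),
        "completion_tokens": usage.get("completion_tokens", 0),
        "total_tokens": usage.get("total_tokens", 0),
    }

# Example response fragment in the shape OpenAI-compatible APIs return
sample = json.dumps(
    {"usage": {"prompt_tokens": 12, "completion_tokens": 30, "total_tokens": 42}}
)
print(read_token_usage(sample))
```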

Provider Configuration

DeerPrompt supports flexible configuration of any OpenAI-compatible API providers, enabling seamless integration with various AI services. You can configure custom providers by specifying the base URL, API key, and optional headers/query parameters. The extension supports personalized model configurations including temperature, top-p, max tokens, reasoning effort, response format, and tool calls. Built-in presets are available for popular providers like OpenAI, Anthropic, DeepSeek, OpenRouter, Hugging Face, VolcEngine, and Aliyun, while custom providers can be easily added through the configuration interface.
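To illustrate what such a provider configuration amounts to on the wire, here is a sketch that assembles an OpenAI-compatible chat completions request from a base URL, API key, and optional extra headers and query parameters. The helper name and defaults are hypothetical, not DeerPrompt's actual code:

```python
from urllib.parse import urlencode

def build_chat_request(base_url, api_key, model, messages,
                       temperature=0.7, top_p=1.0, max_tokens=1024,
                       extra_headers=None, extra_query=None):
    """Assemble the URL, headers, and JSON payload for an OpenAI-compatible
    /chat/completions call. Parameter defaults are illustrative."""
    url = base_url.rstrip("/") + "/chat/completions"
    if extra_query:
        url += "?" + urlencode(extra_query)
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    headers.update(extra_headers or {})
    payload = {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "top_p": top_p,
        "max_tokens": max_tokens,
    }
    return url, headers, payload

# Example: a custom DeepSeek provider (API key placeholder)
url, headers, payload = build_chat_request(
    "https://api.deepseek.com/v1", "sk-...", "deepseek-chat",
    [{"role": "user", "content": "Hello"}],
)
```

The same shape covers the built-in presets: only the base URL, key, and model name change between providers.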

Config

MCP Configuration

MCP

Call local tools via MCP

  1. Convert a LangChain tool to a FastMCP tool using the to_fastmcp function provided by langchain_mcp_adapters. Note that accessing InjectedState inside the tool is currently not supported.

    # tools.py
    from mcp.server.fastmcp import FastMCP
    from langchain_core.tools import tool
    from langchain_mcp_adapters.tools import to_fastmcp


    @tool(description="Say hello to the user")
    def hello(name: str) -> str:
        return f"Hello, {name}!"


    mcp = FastMCP("Demo", tools=[to_fastmcp(hello)])

    if __name__ == "__main__":
        mcp.run()

  2. Configure the MCP server to be launched with the uv run tools.py command.
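DeerPrompt's exact MCP settings schema is not shown here; as a sketch, many MCP clients register a stdio server with a configuration along these lines (key names follow the common mcpServers convention and may differ in DeerPrompt):

```json
{
  "mcpServers": {
    "demo": {
      "command": "uv",
      "args": ["run", "tools.py"]
    }
  }
}
```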
