# DeerPrompt

DeerPrompt is an AI context engineering development assistant.
## Features

### 🤖 AI Chat Assistant

- Intelligent dialogue system based on large language models
- Support for multiple mainstream AI model integrations
- Real-time responses for a smooth conversation experience
### 🛠️ Development Tool Integration

- **Debug prompt files directly in your project**: seamlessly integrated into your development workflow, so you can edit and debug prompt files in place
- **Multi-turn conversation debugging**: complete conversation history management for debugging and optimizing multi-turn conversations
- **Custom model integration**: flexible configuration of AI model parameters, including custom model integrations and configuration management
### 🔧 Advanced Features

- **Tools**: a rich built-in toolset with support for function calls and external API integration
- **MCP (Model Context Protocol)**: standardized model context protocol for compatibility with a wide range of AI services
- **Multi-modal**: send and receive image messages for richer multi-modal interaction
## Provider Configuration

DeerPrompt supports flexible configuration of any OpenAI-compatible API provider, enabling seamless integration with a variety of AI services. You can configure custom providers by specifying the base URL, API key, and optional headers or query parameters. Per-model configuration is supported, including temperature, top-p, max tokens, reasoning effort, response format, and tool calls.

Built-in presets are available for popular providers such as OpenAI, Anthropic, DeepSeek, OpenRouter, Hugging Face, VolcEngine, and Aliyun, and custom providers can be added through the configuration interface.
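To illustrate what "OpenAI-compatible" means in practice, the sketch below assembles the standard chat-completions request body that such providers accept. The model name and message content are placeholders, and this is not DeerPrompt's internal API — just the request shape the per-model parameters above map onto.

```python
def build_chat_request(model, messages, temperature=1.0, top_p=1.0, max_tokens=None):
    """Assemble a chat-completions payload with per-model parameter overrides."""
    payload = {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "top_p": top_p,
    }
    # Optional fields are omitted entirely when unset,
    # letting the provider apply its own defaults.
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    return payload


# Example: a low-temperature request capped at 256 tokens
req = build_chat_request(
    "my-model",
    [{"role": "user", "content": "Hello"}],
    temperature=0.3,
    max_tokens=256,
)
```

Any provider exposing this request shape at its chat-completions endpoint can be configured via the base URL and API key described above.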
## MCP Configuration

### Calling local tools via MCP
Convert a LangChain tool into a FastMCP tool with the `to_fastmcp` helper provided by `langchain_mcp_adapters`. Note that accessing `InjectedState` inside the tool is currently not supported.
```python
# tools.py
from mcp.server.fastmcp import FastMCP
from langchain_core.tools import tool
from langchain_mcp_adapters.tools import to_fastmcp


@tool(description="Say hello to the user")
def hello(name: str) -> str:
    return f"Hello, {name}!"


# Expose the LangChain tool through a FastMCP server
mcp = FastMCP("Demo", tools=[to_fastmcp(hello)])

if __name__ == "__main__":
    mcp.run()
```
Configure the MCP server to start with the `uv run tools.py` command.
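Many MCP clients register a local server with a JSON entry of roughly the following shape. The `mcpServers` key and the server name `demo` are assumptions drawn from the common MCP client convention; the exact keys DeerPrompt expects may differ, so consult its configuration interface.

```json
{
  "mcpServers": {
    "demo": {
      "command": "uv",
      "args": ["run", "tools.py"]
    }
  }
}
```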