# DeepSeek Chat for VS Code

A Visual Studio Code extension that integrates DeepSeek AI chat directly into your editor's sidebar. It provides contextual AI assistance by understanding your active code files and selections.
## Features

- 🤖 Chat with DeepSeek AI models (1.5B and 14B variants)
- 📝 Context-aware responses based on your active file or selection
- ⚡ Real-time streaming responses
- 🎨 Native VS Code theme integration
- 🔍 Syntax highlighting for code snippets
- 🛑 Ability to stop generation mid-stream
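The streaming and stop features above rest on Ollama's newline-delimited JSON response format: each line of a `/api/generate` stream is a small JSON object with a `response` fragment and a `done` flag. As a hedged sketch (not the extension's actual source), here is how a client might accumulate streamed chunks and honor a stop request; `collectStream` and `shouldStop` are illustrative names.

```typescript
// Each line of an Ollama /api/generate stream looks like:
//   {"model":"deepseek-r1:1.5b","response":"Hel","done":false}
// and the final line carries "done": true.
interface OllamaChunk {
  response: string;
  done: boolean;
}

// Accumulate streamed chunks into the full reply, stopping early when
// shouldStop() returns true (e.g. the user clicked "Stop").
function collectStream(
  lines: string[],
  shouldStop: () => boolean = () => false
): string {
  let text = "";
  for (const line of lines) {
    if (shouldStop()) break;      // user aborted mid-stream
    if (!line.trim()) continue;   // skip blank keep-alive lines
    const chunk: OllamaChunk = JSON.parse(line);
    text += chunk.response;
    if (chunk.done) break;        // model finished generating
  }
  return text;
}
```

In the real extension the lines would come from a `fetch()` response body read incrementally; plain string input keeps the sketch self-contained.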
## Prerequisites

- Visual Studio Code `^1.99.0`
- Ollama installed locally
- A DeepSeek model pulled in Ollama
## Installation

1. Install Ollama from [ollama.ai](https://ollama.ai)
2. Pull the DeepSeek model(s):

   ```sh
   ollama pull deepseek-r1:1.5b
   ollama pull deepseek-r1:14b
   ```

3. Install this extension from the VS Code Marketplace
## Configuration

You can customize which DeepSeek models are available in the extension:

1. Open VS Code Settings (File > Preferences > Settings)
2. Search for "DeepSeek Chat"
3. Under "Models", add or modify the available models

The default models are:

- `deepseek-r1:1.5b`
- `deepseek-r1:14b`
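If you prefer editing `settings.json` directly, the entry might look like the fragment below. The setting key `deepseekChat.models` is an assumption for illustration; check the extension's contributed settings for the exact name. VS Code's `settings.json` accepts JSONC comments.

```jsonc
{
  // Assumed setting key -- verify against the extension's settings UI.
  "deepseekChat.models": [
    "deepseek-r1:1.5b",
    "deepseek-r1:14b",
    // Any other tag you have pulled in Ollama, e.g. the 7B variant:
    "deepseek-r1:7b"
  ]
}
```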
## Usage

1. Open the DeepSeek Chat panel from the activity bar (look for the chat icon)
2. Select your preferred model size (1.5B or 14B)
3. Type your question in the input box
4. Toggle "Include active file context" if you want the AI to consider your current file
5. Press Enter or click "Ask Question"
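To illustrate what the "Include active file context" toggle does, here is a hedged sketch of how a question and the active file might be combined into one prompt. The function name and prompt layout are assumptions for illustration, not the extension's actual code.

```typescript
// Hypothetical helper: wrap the user's question with the active file's
// contents (or selection) so the model can answer in context.
function buildPrompt(
  question: string,
  context?: { fileName: string; code: string }
): string {
  // With the toggle off, send the question alone.
  if (!context) return question;
  // With the toggle on, prepend the file so the model can ground
  // its answer in the user's code.
  return `File: ${context.fileName}\n\n${context.code}\n\nQuestion: ${question}`;
}
```

In the extension itself, the file name and text would come from VS Code's active editor rather than being passed in by hand.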
## Development

1. Clone the repository:

   ```sh
   git clone https://github.com/cannidev/deepthroat-ai
   cd deepthroat-ai
   ```

2. Install dependencies:

   ```sh
   npm install
   ```

3. Start the build watcher:

   ```sh
   npm run watch
   ```

4. Press F5 to open a new VS Code window with the extension loaded
## Building

To create a VSIX package:

```sh
npm run package
```
## Contributing

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments