VSCode Ollama Extension
VSCode Ollama is a powerful Visual Studio Code extension that seamlessly integrates Ollama's local LLM capabilities into your development environment.
✨ Features
🚀 Quick Start
📺 Tutorial
Install Ollama
```shell
# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh
```
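After installing, you may want to confirm the Ollama server is reachable and pull a model before opening the extension. A minimal sketch — the model name is only an example; substitute any model from the Ollama library:

```shell
# Start the Ollama server (on many systems it runs automatically after install)
ollama serve &

# Pull a model to chat with (example name; pick any model you prefer)
ollama pull llama3.2

# Confirm the server answers on its default port (11434)
curl -s http://localhost:11434/api/tags
```

If the final `curl` returns a JSON list of models, the extension can connect with its default server address.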
Install Extension
- Open Extensions in VS Code
- Search for "VSCode Ollama"
- Click Install
Configure Extension
- Open the Command Palette (Ctrl+Shift+P on Windows/Linux, Cmd+Shift+P on macOS)
- Type "Ollama: Settings"
- Configure server address and default model
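If you prefer editing `settings.json` directly, the configuration might look like the fragment below. The setting keys shown are assumptions for illustration — check the extension's Settings page for the exact names it contributes:

```json
{
  // Hypothetical keys — verify the actual IDs in the extension's settings
  "vscode-ollama.serverUrl": "http://localhost:11434",
  "vscode-ollama.defaultModel": "llama3.2"
}
```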
Start Using
- Use the command "Ollama: Open Chat" to start a conversation
- Select a model in the chat interface
- Toggle web search on or off as needed
- Send a message to interact
📝 Usage
Commands
Ollama: Open Chat - Open chat interface
Ollama: Settings - Open settings page
Shortcuts
Shift + Enter - New line in chat input
Enter - Send message
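You can also bind the extension's commands to your own shortcuts via `keybindings.json`. A sketch — the command ID here is an assumption; look up the real one via the Command Palette (hover "Ollama: Open Chat" and click the gear icon):

```json
[
  {
    // Hypothetical command ID — confirm it in the Keyboard Shortcuts editor
    "key": "ctrl+alt+o",
    "command": "vscode-ollama.openChat"
  }
]
```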
❤️ Support & Donation
If you find this extension helpful, you can support the developer by:
💰 Donation Methods

- WeChat Pay
- Alipay

🪙 Cryptocurrency

- Bitcoin
  - Native SegWit: bc1qskds324wteq5kfmxh63g624htzwd34gky0f0q5
  - Taproot: bc1pk0zud9csztjrkqew54v2nv7g3kq0xc2n80jatkmz9axkve4trfcqp0aksf
- Ethereum: 0xB0DA3bbC5e9f8C4b4A12d493A72c33dBDf1A9803
- Solana: AMvPLymJm4TZZgvrYU7DCVn4uuzh6gfJiHWNK35gmUzd
- ⭐ Star the GitHub repository
- 📝 Submit issues or feedback
- 🚀 Contribute to the codebase
- 💬 Share with your friends

Your support helps maintain and improve this extension! Thank you! ❤️
📝 Release Notes
See CHANGELOG.md for release notes.
📝 License
This extension is licensed under the MIT License.
Star History
