A powerful AI code assistant for VS Code that integrates with Ollama to provide intelligent code assistance and natural language processing capabilities.
## Installation
Launch VS Code Quick Open (Ctrl+P), paste the following command, and press Enter.
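The extension's Marketplace identifier is not listed in this README, so the command below uses a placeholder; replace `<publisher>.<extension-name>` with the actual identifier from the Marketplace listing:

```
ext install <publisher>.<extension-name>
```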
## Features

- **AI-Powered Code Assistance**: Get intelligent code suggestions and explanations
- **Natural Language Processing**: Ask questions and get detailed responses
- **Multiple Model Support**: Choose from various AI models including Llama 2, Mistral, Code Llama, and more
- **Seamless Integration**: Access AI features directly from VS Code's sidebar
- **Code Highlighting**: Syntax highlighting for code blocks in responses
- **Markdown Support**: Rich text formatting in AI responses
## Usage

1. Open VS Code.
2. Access the Ollama Code Assistant through:
   - The sidebar icon (Ollama Code Assistant)
   - The Command Palette (Ctrl+Shift+P or Cmd+Shift+P)
   - The context menu in TypeScript files
3. Select your preferred AI model from the dropdown.
4. Enter your prompt in the input field.
5. Press Run or use Ctrl+Enter / Cmd+Enter to submit.
## Requirements

- VS Code 1.99.0 or higher
- Node.js 20.x or higher
- Ollama installed and running locally (see the quick checks below)
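You can confirm these requirements from a terminal, assuming `code`, `node`, and `ollama` are on your PATH:

```bash
# Check installed versions against the minimums above
code --version     # VS Code 1.99.0 or higher
node --version     # Node.js 20.x or higher
ollama --version   # confirms the Ollama CLI is installed
```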
## Configuration

The extension requires Ollama to be running locally. Make sure you have:

- Ollama installed on your system
- At least one AI model pulled (e.g., llama2, mistral, codellama); see the example commands below
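For example, with a default Ollama installation (serving on `http://localhost:11434`), you can pull a model and verify that the server is reachable:

```bash
# Pull a model for the extension to use
ollama pull llama2

# List the models available locally
ollama list

# Confirm the Ollama server is running and reachable
curl http://localhost:11434/api/tags
```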
## Features in Detail

### Code Assistance

- Get code suggestions and explanations
- Receive help with debugging
- Generate code based on natural language descriptions

### Natural Language Processing

- Ask questions about programming concepts
- Get explanations of complex topics
- Receive detailed answers to technical questions

### UI Features

- Clean, modern interface
- Syntax highlighting for code blocks
- Markdown support for rich text formatting
- Easy model selection
- Quick access through VS Code's sidebar
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
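If you want to run the extension from source, a typical VS Code extension workflow looks like this (the repository URL is a placeholder, and script names may differ in this project):

```bash
# Clone the repository and install dependencies
git clone <repository-url>
cd <repository-folder>
npm install

# Open the folder in VS Code, then press F5 to launch an
# Extension Development Host with the extension loaded
code .
```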
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Support
If you encounter any issues or have suggestions, please open an issue in the repository.