# PraxCode - AI Code Assistant for VS Code
*Your intelligent coding companion with deep codebase understanding*
PraxCode is a powerful, context-aware AI code assistant for Visual Studio Code that brings the capabilities of premium AI coding tools directly into your editor. Unlike other AI coding extensions, PraxCode gives you the freedom to choose between local and cloud LLM providers while maintaining privacy through local vector indexing of your codebase.
## ✨ Key Features
### 🧠 Deep Codebase Understanding
PraxCode indexes your entire codebase to provide truly context-aware assistance. The built-in Retrieval-Augmented Generation (RAG) system ensures that AI responses are grounded in your specific code, not just generic knowledge.
[Screenshot: PraxCode analyzing a codebase and providing context-aware responses]
### 💬 Intelligent Chat Interface
The sidebar chat interface provides a seamless way to interact with your AI assistant. Ask questions about your code, request explanations, or get suggestions for improvements - all with the context of your entire codebase.
[Screenshot: PraxCode's chat interface showing a conversation about code]
### 🔍 Code Explanation
Select any code snippet and get a detailed explanation with the "Explain Code" command. PraxCode analyzes the selected code in the context of your entire codebase to provide comprehensive explanations.
[Screenshot: PraxCode explaining a complex code snippet with context from the codebase]
### 🌐 Multi-Provider Support
Choose the AI model that works best for you:
- **Local Models** via Ollama (Llama, CodeLlama, Mistral, etc.)
- **Cloud Providers** including OpenAI (GPT-3.5, GPT-4)
- **Coming Soon:** Anthropic (Claude), Google (Gemini), and more
### 🔒 Privacy-Focused
When using local models, your code never leaves your machine. The vector store is maintained locally, ensuring your intellectual property remains secure.
## 📋 Requirements
- Visual Studio Code 1.99.0 or higher
- For local models: Ollama installed and running
- For cloud models: API keys for the respective services
## 🚀 Installation
1. Install the extension from the VS Code Marketplace
2. Configure your preferred LLM provider in the settings
3. Index your workspace using the "PraxCode: Index Workspace" command
## 🛠️ Setup Guide
### Setting Up Local Models with Ollama
1. **Install Ollama:**
   - Download and install from [ollama.ai](https://ollama.ai)
   - Follow the installation instructions for your operating system

2. **Pull a Model:**

   ```bash
   # For general coding assistance
   ollama pull llama3

   # For specialized code understanding
   ollama pull codellama
   ```
3. **Start the Ollama Server:**
   - Ollama typically runs as a background service
   - Verify it's running by opening http://localhost:11434/ in your browser or running `curl http://localhost:11434` in a terminal
4. **Configure PraxCode:**
   - Open the VS Code settings
   - Set `praxcode.llmProvider` to `ollama`
   - Set `praxcode.ollamaUrl` to `http://localhost:11434` (or your custom URL)
   - Set `praxcode.ollamaModel` to your preferred model (e.g., `llama3` or `codellama`)
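Equivalently, these entries can go straight into `settings.json`. A minimal sketch using the defaults from the configuration table below (point the model at whichever one you pulled):

```jsonc
// settings.json (User or Workspace)
{
  "praxcode.llmProvider": "ollama",
  "praxcode.ollamaUrl": "http://localhost:11434",
  "praxcode.ollamaModel": "llama3" // or "codellama", etc.
}
```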
### Setting Up Cloud Providers
**OpenAI Setup:**

1. Create an account at [OpenAI](https://platform.openai.com)
2. Generate an API key in your account dashboard
3. In VS Code settings:
   - Set `praxcode.llmProvider` to `openai`
   - When prompted, enter your API key (it will be stored securely)
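In `settings.json` terms, only the provider switch is stored as a setting; the API key lives in VS Code's Secret Storage (see Privacy & Security below) rather than in your configuration files:

```jsonc
{
  // The API key is entered via the secure prompt, not stored here
  "praxcode.llmProvider": "openai"
}
```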
**Other Providers (Coming Soon):**

- Similar setup process with provider-specific API keys
- Configuration through the same settings interface
### Indexing Your Workspace
1. Open the command palette (`Ctrl+Shift+P` or `Cmd+Shift+P`)
2. Run the command "PraxCode: Index Workspace"
3. A progress notification will appear showing the indexing status
4. Once complete, PraxCode will have a deep understanding of your codebase
[Screenshot: PraxCode indexing a workspace with progress indicator]
## 💻 Usage Examples
### Asking Questions About Your Code
1. Open the PraxCode sidebar by clicking the PraxCode icon in the activity bar
2. Type your question in the chat input, for example:
   - "How does the authentication system work in this project?"
   - "Explain the data flow in the application"
   - "What are the main components of this codebase?"
### Getting Code Explanations
1. Select a code snippet in your editor
2. Right-click and select "PraxCode: Explain Selected Code" or use the command palette
3. A detailed explanation will appear, analyzing the code in context
### Customizing Indexing
By default, PraxCode indexes most code files while excluding common directories like `node_modules`. You can customize this behavior (see the example after this list):

- Open VS Code settings
- Modify `praxcode.indexing.includePatterns` to add specific file types
- Modify `praxcode.indexing.excludePatterns` to exclude certain directories or files
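For instance, to also index Rust and shell sources and skip a generated-output folder, you might extend the documented defaults like this in `settings.json` (the added patterns are purely illustrative):

```jsonc
{
  // Documented defaults plus Rust and shell files (illustrative additions)
  "praxcode.indexing.includePatterns": [
    "**/*.{js,ts,jsx,tsx,py,java,c,cpp,cs,go,rb,php,html,css,md}",
    "**/*.rs",
    "**/*.sh"
  ],
  // Documented defaults plus a hypothetical generated-code directory
  "praxcode.indexing.excludePatterns": [
    "**/node_modules/**",
    "**/dist/**",
    "**/build/**",
    "**/.git/**",
    "**/generated/**"
  ]
}
```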
## ⚙️ Configuration Options
| Setting | Description | Default |
|---------|-------------|---------|
| `praxcode.llmProvider` | The LLM provider to use | `ollama` |
| `praxcode.ollamaUrl` | URL of the Ollama API server | `http://localhost:11434` |
| `praxcode.ollamaModel` | Model to use with Ollama | `llama3` |
| `praxcode.vectorStore.enabled` | Enable/disable vector store for indexing | `true` |
| `praxcode.vectorStore.embeddingModel` | Embedding model for vector embeddings | `nomic-embed-text` |
| `praxcode.indexing.includePatterns` | Glob patterns for files to include | `["**/*.{js,ts,jsx,tsx,py,java,c,cpp,cs,go,rb,php,html,css,md}"]` |
| `praxcode.indexing.excludePatterns` | Glob patterns for files to exclude | `["**/node_modules/**", "**/dist/**", "**/build/**", "**/.git/**"]` |
| `praxcode.ui.showStatusBarItem` | Show/hide the status bar item | `true` |
| `praxcode.logging.logLevel` | Log level (`debug`, `info`, `warn`, `error`) | `info` |
## 📚 Supported Models
### Ollama Models
- Llama 3 (recommended)
- CodeLlama
- Mistral
- Phi-2
- Gemma
- And any other models supported by Ollama
### OpenAI Models
- GPT-3.5 Turbo
- GPT-4
- GPT-4 Turbo
### Coming Soon
- Anthropic Claude models
- Google Gemini models
- Custom model support
## 🔧 Commands
| Command | Description |
|---------|-------------|
| `praxcode.indexWorkspace` | Index the current workspace for context-aware responses |
| `praxcode.explainCode` | Explain the currently selected code |
| `praxcode.showMenu` | Show the PraxCode menu with common actions |
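These command IDs can be bound to keyboard shortcuts like any other VS Code command. As a sketch, a `keybindings.json` entry along these lines (the key chord and `when` clause are just one sensible choice):

```jsonc
// keybindings.json: trigger "Explain Selected Code" from the keyboard
[
  {
    "key": "ctrl+alt+e",               // example chord; any free one works
    "command": "praxcode.explainCode",
    "when": "editorHasSelection"       // only when some code is selected
  }
]
```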
## 🔒 Privacy & Security
- **Local Processing:** When using Ollama, all code processing happens locally on your machine
- **Secure Storage:** API keys for cloud providers are stored securely using VS Code's Secret Storage
- **Selective Indexing:** You have full control over which files are indexed through include/exclude patterns
- **No Telemetry:** PraxCode does not collect usage data or send your code to external servers
## 📝 Release Notes
See [CHANGELOG.md](CHANGELOG.md) for detailed release notes.
### Latest Release: v0.1.0
Initial release of PraxCode with the following features:
- Multi-provider LLM integration (Ollama, OpenAI)
- Contextual chat with RAG
- Code explanation
- Workspace indexing
## 📄 License
This extension is licensed under the MIT License.
## 🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request or open an Issue on our GitHub repository.
*Elevate your coding experience with PraxCode - where AI meets deep code understanding.*

**Note:** Screenshots in this README will be updated with actual images in the next release.

Enjoy coding with PraxCode!