LSPAI is a powerful VS Code extension that revolutionizes unit test creation through Language Server Protocol (LSP) integration. Generate high-quality unit tests automatically and in real-time for multiple programming languages.
Installation
Launch VS Code Quick Open (Ctrl+P), paste the following command, and press Enter.
✨ Key Features
🚀 Real-time unit test generation as you code
🌍 Multi-language support (Java, Go, Python)
🎯 Semantic-aware test generation using LSP
🎯 Project Status
| IDE      | Java | Python | Go | C++ | TypeScript | Others |
|----------|------|--------|----|-----|------------|--------|
| VS Code  | ✅   | ✅     | ✅ | 🚧  | 🚧         | 🚧     |
| IntelliJ | 🚧   | 🚧     | 🚧 | 🚧  | 🚧         | 🚧     |
Legend:
✅ Fully Supported
🚧 In Development
❌ Not Yet Supported
🛠️ Setup Guide
1. Configure Language Servers
Install the required language servers from the VS Code Marketplace (if you pulled the Docker image, these are already installed):
Python: the Pylance and Python extensions (identifiers: ms-python.vscode-pylance, ms-python.python)
Go: the Go extension (identifier: golang.go)
Language-Specific Setup (already configured if you pulled the Docker image):
For Go:
Enable semantic tokenization in your VS Code settings.json:
```json
{
  "gopls": {
    "ui.semanticTokens": true
  }
}
```
[Optional] Project Compilation
While not required, compiling your project can improve LSPAI's error diagnosis and auto-fixing capabilities.
Refer to the Experiment Setup section for details.
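As a sketch, typical pre-compilation steps for the supported languages look like the following (the build tools are assumptions — use whatever your project actually builds with):

```shell
# Optional pre-compilation so LSPAI's diagnosis/auto-fixing has build errors to draw on.
# Each step runs only when its toolchain and project files are present.
if [ -f pom.xml ] && command -v mvn >/dev/null; then mvn -q compile; fi   # Java (Maven assumed)
if [ -f go.mod ] && command -v go >/dev/null; then go build ./...; fi     # Go
python -m compileall -q .                                                 # Python: surfaces syntax errors early
```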
2. Configure LLM Settings
LSPAI supports multiple LLM providers. Configure your preferred option in VS Code settings:
Open VS Code Settings (Ctrl/Cmd + ,)
Configure LLM settings using one of these methods:
Option A: Through VS Code Settings UI
Search for "lspAi" settings
Configure your preferred model and API keys
Option B: Direct JSON Configuration
Add the following to your VS Code User Settings (settings.json):
```json
{
  "lspAi": {
    "model": "deepseek-chat",                        // Options: "gpt-4o-mini", "llama3-70b", "deepseek-chat"
    "openaiApiKey": "your-api-key",                  // Required for OpenAI
    "deepseekApiKey": "your-api-key",                // Required for Deepseek
    "localLLMUrl": "http://your-ollama-server:port", // Required for Ollama
    "proxyUrl": "your-proxy-server"                  // Optional: set if using a proxy
  }
}
```
The settings may appear grayed out in the UI; this is normal.
3. Generate Unit Tests
Now you are ready!
To generate unit tests with LSPAI:
Open your workspace in VS Code
Navigate to the function/method you want to test
Right-click within the function definition
Select "LSPAI::Generate Unit Test" from the context menu
Wait for test generation to complete (it may take 30-40 seconds)
Review the generated test and add it to your project!
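For a flavor of what to expect, here is a hypothetical illustration (not actual LSPAI output) of the kind of unit test such a tool produces for a small Python function:

```python
# Hypothetical illustration only -- not actual LSPAI output.
# `slugify` stands in for a function you might right-click to test.

def slugify(title: str) -> str:
    """Lower-case a title and join its words with dashes."""
    return "-".join(title.lower().split())

# A generated test typically covers the happy path plus an edge case:
def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"

def test_slugify_collapses_whitespace():
    assert slugify("  Unit   Test  ") == "unit-test"
```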
🛠️ Experiment Reproduction Guide
LSPAI is published as a research paper (currently under review). For detailed instructions on reproducing our experimental results, please refer to our Experiments Documentation.
The experiments documentation includes:
Dataset preparation steps
Benchmark setup instructions
Evaluation metrics and procedures
Statistical analysis methods
Hardware and software requirements
🔍 Known Issues
X Server Display Error
[ERROR:ozone_platform_x11.cc(245)] Missing X server or $DISPLAY
Solution: run your command through `xvfb-run`, e.g. `xvfb-run <your-command>`
📝 Release Notes
1.0.0
Initial release with support for Java, Go, and Python