# DevHelperAI README

Developer productivity is strongly shaped by the tools developers use. Estimates suggest that developers spend 30% or more of their time on routine, manual tasks, which leads to lower productivity, burnout, and dissatisfaction. Recently, it has become evident that LLMs (Large Language Models) can help address several of these productivity challenges in software development. As a result, there is a growing need for AI-based coding assistants to enhance developer productivity, akin to having an experienced software developer available at all times. We present DevHelperAI, a chat-based, context-aware, and flexible VSCode AI assistant extension packed with productivity-enhancing features.

This documentation is a work in progress and will be updated soon. Thanks for your patience!

## Features

1. Codebase parsing and storage
2. Question answering
3. Function dependency discovery
4. Context-aware code and test generation
5. Code review assistant
6. Full control over the LLM

## Requirements

Please install Ollama by visiting https://ollama.com/download and following the installation instructions for your operating system. Running Ollama locally is required: the extension relies on it to handle model downloads, deployment, and resource allocation for inference. A quick way to verify your local setup is shown below.
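As a sanity check before using the extension, you can confirm that the local Ollama server is reachable and has at least one model pulled. The sketch below is illustrative and not part of the extension itself; it assumes Ollama's default port (11434) and uses its `GET /api/tags` endpoint, which lists locally installed models. The helper name `checkOllama` is ours.

```typescript
// Minimal sketch (Node 18+ for global fetch): verify that a local Ollama
// server is up and has at least one model available.
// Assumption: Ollama is listening on its default address, localhost:11434.
async function checkOllama(baseUrl = "http://localhost:11434"): Promise<void> {
  // GET /api/tags returns the models that have been pulled locally.
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with HTTP ${res.status}`);
  }
  const { models } = (await res.json()) as { models: { name: string }[] };
  if (models.length === 0) {
    console.warn("Ollama is running, but no models are installed yet.");
  } else {
    console.log("Available local models:", models.map((m) => m.name).join(", "));
  }
}

checkOllama().catch((err) => {
  console.error("Could not reach Ollama at localhost:11434:", err);
});
```

If no models are listed, pull one first with the Ollama CLI (for example, `ollama pull <model>`), then re-run the check.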
## Extension Settings

## Known Issues

This extension has not been tested on Apple Silicon; please use it with caution. More development is in progress.

## Release Notes

### 0.0.5

Updated README.

### 0.0.3

Bug fixes.

### 0.0.1

Initial release of DevHelperAI.

**Enjoy!**