OpenPilot is an open-source AI programming assistant packaged as a Visual Studio Code extension. It connects your editor to a variety of Large Language Models, including those from OpenAI and Google. Ask the LLM questions about your codebase, generate new code snippets, or make changes to your existing files.
Context is automatically included
No need to copy & paste code into your chat. OpenPilot builds a vector store from the files in your workspace and matches them against the semantic content of your prompt. It then sends the relevant files along with your request, so the LLM has the context it needs to give a better response.
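The idea behind this matching can be sketched in a few lines. This is a toy illustration, not OpenPilot's actual implementation (which uses real embedding models and a vector store); the names `embed`, `cosine`, and `top_matches` are made up for the example, and the "embedding" here is just a bag-of-words vector standing in for a learned one.

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words term-frequency vector.
    # A real indexer would call an embedding model here instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def top_matches(prompt: str, files: dict[str, str], k: int = 2) -> list[str]:
    # Rank workspace files by similarity to the prompt; return the top k paths.
    q = embed(prompt)
    ranked = sorted(files, key=lambda path: cosine(q, embed(files[path])), reverse=True)
    return ranked[:k]

files = {
    "auth.py": "def login(user, password): check credentials and create a session",
    "db.py": "def connect(): open a connection to the database",
    "ui.py": "def render(): draw the settings panel",
}
print(top_matches("how does login session handling work?", files))
```

The prompt about login sessions ranks `auth.py` first, so that file would be attached to the request; the same ranking step works unchanged once the toy vectors are swapped for real embeddings.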
Choose your LLM
Switch between any of the models you have access to from OpenAI or Google. More to come soon!
Bring your own keys
Indexing your workspace lets OpenPilot automatically find files that are relevant to your chat. This requires an OpenAI API key, even if you are chatting with a model that isn't GPT.
Chroma is a vector store that stores the embeddings generated from your workspace files.
Run the Index Workspace command
You can still chat even if you choose not to index; you'll just have to copy & paste any code that might be needed.
An option to connect to a locally-running LLM for better privacy.
Generate embeddings using a local model instead of OpenAI for better privacy and zero-cost file indexing and lookup.
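One zero-cost way to produce embeddings locally is feature hashing: each token is hashed into a fixed-size vector, so no API call is ever made. This is only a rough stand-in for a real local embedding model, and `local_embed` and `DIM` are names invented for this sketch.

```python
import hashlib
from math import sqrt

DIM = 256  # small fixed dimension for the sketch

def local_embed(text: str, dim: int = DIM) -> list[float]:
    # Toy local "embedding" via feature hashing: each token is hashed to a
    # bucket, so there are no API calls and no per-request cost. A real
    # local setup would use an actual embedding model instead.
    vec = [0.0] * dim
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]  # unit-normalize so dot product = cosine

a = local_embed("open a database connection")
b = local_embed("connect to the database")
sim = lambda x, y: sum(p * q for p, q in zip(x, y))
print(sim(a, b))  # positive: the texts share vocabulary
```

Hashed vectors only capture word overlap, not meaning, which is why a real local model would still be preferable; the point is that the rest of the indexing pipeline doesn't care where the vectors come from.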
GPT-3.5 is bad at following instructions about how to format diffs, at least with the prompts that have been tried so far. A different approach might be needed.
Support more 3rd-party LLMs
I've been sitting on the waitlist for other companies' APIs for quite a while. Maybe if you work for one of these companies you can hook me up 😉