# Bot Typist for Visual Studio Code

Q: Yet another AI tool? What does this one do?

A: Bot Typist lets you chat with an AI bot, but in a Jupyter notebook that you've opened in VS Code. Or to put it another way, it types things into cells that came from a bot. Bot Typist. See?

Q: Okay, why would I want that?

A: If you're like me, it's because you think ChatGPT's ~~Code Interpreter~~ Advanced Data Analysis feature is a fun toy, but a terrible substitute for a notebook. Having a conversation about code in a Python notebook that's running on your own machine has a lot of advantages:
Also, you might want to use some language besides Python? So far, I've added support for TypeScript.

Q: Will Bot Typist automatically run the code that the bot generates?

A: No, and that's intentional. Since there's no sandbox, I think it's a bit too risky. Instead, you can run the cells yourself. (Hopefully after reading them!)

But the result is still much like Code Interpreter. If running a cell fails with an error, you don't need to say anything; just run the command to get a bot response. The cell's outputs, including any errors, will be included in the prompt. GPT4 will see it, apologize, and try to fix it. It's a little less magical, but this is how Code Interpreter does it anyway.

Also, after getting corrected code, you can delete the mistaken code and the apology if you like. That will conserve context window (and money) as the conversation gets longer.

Another problem this avoids: when using Code Interpreter, the bot will often read interpreter output and lie to you about it, claiming that a test passed when it actually failed. I find it's better to read it myself. If you find the output surprising or confusing, you can always ask the bot about it.
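To make that loop concrete, here's a made-up example of a cell a bot might write. (The data and the bug are invented for illustration; this isn't output from Bot Typist itself.)

```python
# A cell the bot wrote and you chose to run.
# Suppose it guessed wrong about the shape of the data:
rows = [{"name": "Ada", "score": 95}, {"name": "Grace", "score": 98}]

total = sum(row["points"] for row in rows)  # KeyError: 'points' (the key is actually "score")
print(total / len(rows))
```

The traceback becomes part of the cell's output, so it rides along in the next prompt and the bot can see exactly what went wrong.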
Q: Which bots can I chat with?

A: I mostly use GPT4, but with a little work, you could use any bot you like. Bot Typist uses Simon Willison's llm command to communicate with the AI. The llm command supports several bot APIs. It can also run a language model locally on your machine if you've set it up for that. (If it doesn't support the one you want, you could even write a plugin to do it.)

Q: Why should I write code this way instead of using Copilot?

A: I mostly do it for fun. I've learned a few things about Python programming, too. For example, there's a library called numpy that's pretty cool. :)

A more practical use for Bot Typist might be writing coding tutorials? A raw conversation with a bot probably wouldn't make a good tutorial, but you could edit the transcript into something nice.

Q: Isn't this the same as (some other tool)?

A: I probably don't know about it, because there are a zillion other AI tools and I can't be bothered to look through them. If it's a good one, maybe let me know what you found?

## Features

Bot Typist adds three commands:

* This creates a Jupyter notebook with a cell explaining what to do.
* This is the only command you really need. After typing something into a cell, run this command to add the bot's reply. Since bots are often slow, the reply will stream in like a proper chat should. (Hit escape to interrupt.) This command is bound to
* This command opens a new editor with the prompt that would be sent to llm for the current cell. (It also displays the system prompt.) Bot Typist sends everything from the current cell and all previous cells, except that it stops at a horizontal rule in a Markdown cell. You can use a horizontal rule to mark the beginning of a chat, or as a barrier to avoid sending too large a prompt.

And that's all. Not much to it.

## Requirements

It's up to you to install the llm command and make sure it works. This will include adding whatever API key you need.

You also need to install and configure Jupyter. I like to use miniconda and set up two different Python environments, one for llm and the other for Jupyter, but you can do it however you like.
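If you want a quick sanity check that both tools are reachable, something like the sketch below should print a version for each. (It assumes `llm` and `jupyter` are on your PATH and accept a `--version` flag; adjust it for however you've laid out your environments.)

```python
# Rough sanity check: can we find the llm and jupyter commands?
import shutil
import subprocess

for tool in ("llm", "jupyter"):
    path = shutil.which(tool)
    if path is None:
        print(f"{tool}: not found on PATH")
    else:
        result = subprocess.run([tool, "--version"], capture_output=True, text=True)
        print(f"{tool}: {result.stdout.strip() or result.stderr.strip()}")
```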
## Settings

Required:

Optional:
All these settings can be customized for each programming language.

## Known Issues
## Release Notes

### 0.4.0 - "Deno notebooks are a thing now."
### 0.3.0 - "Stopping is important"
### 0.2.0 - "LLM's have lots of options"

Add settings to customize how the llm command is called.

### 0.1.0 - "It works on my machine"

First release. Let's see if anyone likes it.