# IBM® watsonx™ Code Assistant

## Overview

IBM® watsonx™ Code Assistant is an innovative, generative AI coding companion that offers robust, contextually aware assistance for popular programming languages, including Go, C, C++, Java, JavaScript, Python, TypeScript, and more. Seamlessly integrated into your IDE, it helps you accelerate your productivity and simplify coding tasks, all with trust, security, and compliance.

Note for IBM Granite.Code users: IBM Granite.Code is now part of the watsonx Code Assistant product portfolio. If you are an individual user, you can continue to use Ollama to access a local IBM Granite model. For increased performance and new features, provision a trial of watsonx Code Assistant on IBM Cloud. For more information, see the IBM Cloud catalog.

## Features

### Get code suggestions

- Use chat conversations: Use natural language prompts to generate code suggestions. In a chat conversation, enter a prompt that describes the code you need, and watsonx Code Assistant generates a suggestion that you can choose to use.
- Reference code: To ask questions about, or refine, a specific file, class, function, or method in your workspace, use a code reference. These references provide important context and can help to increase the accuracy of the answer. As part of your chat message, type the @ symbol to see a list of files, classes, and methods from your workspace. Click to select the reference, and watsonx Code Assistant sends the contents of the reference as part of your message.
- Code completion: Alternatively, complete code in the editor. Start typing a line of code, then pause, and watsonx Code Assistant adds a code suggestion to complete the line that you typed. You can also get a multiline code suggestion: start typing a line of code, then use a keyboard shortcut, and watsonx Code Assistant adds a multiline code suggestion. Or, enter a comment that describes the code you want.

For more information, see the documentation for Getting code suggestions.

### Explain code

Use generative AI to analyze and summarize your code and understand what the code does. Click the Explain option that precedes a code block, or enter a request in a chat conversation. For more information, see the documentation for Explaining code.

### Document code

Generate comment lines that document what your code does. Click the Document option that precedes a code block, or enter a request in a chat conversation. For more information, see the documentation for Documenting code.

### Generate unit tests

Create unit tests to evaluate your code functions. Click the Unit Test option that precedes a code block, or enter a request in a chat conversation. For more information, see the documentation for Generating unit tests.

### Translate code from one language to another

Use watsonx Code Assistant to translate code. In a chat conversation, ask watsonx Code Assistant to translate your code into the target language. For more information, see the documentation for Translating code from one language to another.

## Setup

Provision a watsonx Code Assistant service instance on IBM Cloud for your organization to get the best performance and the full set of features. Or, as an individual, you can use Ollama to access a local IBM Granite model.

### Use a watsonx Code Assistant service instance on IBM Cloud

To set up on IBM Cloud, provision a service instance for your organization and connect the watsonx Code Assistant extension in your IDE to that instance.
For more information, see the documentation for setting up watsonx Code Assistant on IBM Cloud.

### Use watsonx Code Assistant with a local IBM Granite model

For individual users, IBM watsonx Code Assistant can access a local model through Ollama, a widely used local inferencing engine for LLMs. Ollama wraps the underlying model-serving project llama.cpp.

#### Install the IBM watsonx Code Assistant extension

Install the IBM watsonx Code Assistant extension from the Visual Studio Code Marketplace.
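If you prefer the command line, a minimal sketch of installing the extension with the Visual Studio Code CLI follows; the extension identifier below is a placeholder, so use the identifier shown on the extension's Marketplace page.

```shell
# Hypothetical identifier: replace <publisher>.<extension-id> with the value
# listed on the watsonx Code Assistant page in the Visual Studio Code Marketplace.
code --install-extension <publisher>.<extension-id>
```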
#### Install Ollama

Download and install Ollama on your local machine.
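For example, assuming a macOS or Linux machine, you can use Ollama's published install methods; on other platforms, download the installer from the Ollama website.

```shell
# macOS, using Homebrew
brew install ollama

# Linux, using Ollama's install script
curl -fsSL https://ollama.com/install.sh | sh
```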
#### Start the Ollama inference server

In a console window, run:
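```shell
# Start the local Ollama inference server.
ollama serve
```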
Leave that window open while you use Ollama. If you receive a message that the address is already in use, the Ollama server is already running and you do not need to start it again.

#### Install the IBM Granite code model

Get started with IBM watsonx Code Assistant by installing an IBM Granite code model through Ollama.
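For example, a minimal sketch of pulling a Granite code model from the Ollama model library; the granite-code:8b tag is an assumption, so use the model tag that your version of the extension expects.

```shell
# Pull an IBM Granite code model from the Ollama library.
# The granite-code:8b tag is an assumption; substitute the tag your extension expects.
ollama pull granite-code:8b
```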
#### Configure the Ollama host

By default, the Ollama server runs on IP address 127.0.0.1, port 11434. If you customize the IP address or port that Ollama uses, configure the extension to use the same host.
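If you need a non-default address or port, Ollama reads the OLLAMA_HOST environment variable when it starts; the address and port below are illustrative, and the extension's host setting must point at the same value.

```shell
# Illustrative values: run the Ollama server on a non-default address and port.
OLLAMA_HOST=127.0.0.1:11500 ollama serve
```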
#### Configure the Granite model to use

By default, watsonx Code Assistant uses its default IBM Granite code model. To use a different model:
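A minimal sketch, assuming the alternative model is available in the Ollama model library and that your version of the extension exposes a setting for the model name:

```shell
# Pull an alternative Granite code model; the granite-code:20b tag is an assumption.
ollama pull granite-code:20b
```

Then update the extension's model setting to the tag you pulled.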
## Securing your setup

### Your Visual Studio Code environment

IBM watsonx Code Assistant does not provide any additional security controls. It is recommended that you take steps to properly secure your setup.
### Chat conversation storage

IBM watsonx Code Assistant stores all your chat conversations locally in your file system.

### Telemetry data

IBM watsonx Code Assistant does not collect any telemetry data. In general, IBM watsonx Code Assistant does not send any data that it processes to a third party, IBM included.

### Connecting IBM watsonx Code Assistant and Ollama

By default, the Ollama server runs on your local device at IP address 127.0.0.1, port 11434, using http as the protocol. To use https instead, or to go through a proxy server, see the Ollama documentation.