Connects Visual Studio Code with the Gemini AI model via API.
Description
The LLM-Plugin extension connects Visual Studio Code to the Gemini AI model via API calls, letting users send input prompts to the AI and receive intelligent suggestions directly within their coding environment.
Features
Sends user input to Gemini AI and retrieves suggestions.
Keeps communication with the AI service reliable by retrying requests after network or server issues.
Encrypts and securely manages your API key locally.
Allows users to specify the storage path for their API key.
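The retry behavior described above could be implemented with a small helper that re-runs a failing request with exponential backoff. This is a hedged sketch, not the extension's actual code; the function name, attempt count, and delays are illustrative assumptions, and the request callback stands in for the extension's axios call to the Gemini API:

```typescript
// Hypothetical retry helper (a sketch, not the extension's real implementation).
// `fn` stands in for an axios request to the Gemini API.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts) {
        // Exponential backoff: 500ms, 1000ms, 2000ms, ...
        await new Promise((resolve) =>
          setTimeout(resolve, baseDelayMs * 2 ** (attempt - 1))
        );
      }
    }
  }
  // All attempts failed; surface the last error to the caller.
  throw lastError;
}
```

A caller would wrap the Gemini request, e.g. `withRetry(() => axios.post(url, body))`, so transient network or server errors are retried before the failure is reported to the user.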
Dependencies
axios: Handles HTTP requests to the Gemini AI API.
esbuild: A fast bundler for building the extension.
npm-run-all: Runs multiple npm scripts in parallel or in sequence.
typescript: Provides TypeScript support for the extension.
eslint: Used for linting the TypeScript codebase to enforce code quality standards.
@typescript-eslint/parser: ESLint parser for TypeScript.
@typescript-eslint/eslint-plugin: ESLint plugin with TypeScript-specific rules.
@types/vscode: Provides type definitions for the VS Code API.
@types/node: Provides type definitions for Node.js.
@types/jest: Type definitions for Jest testing framework.
jest: Testing framework for running tests.
ts-jest: Jest transformer that lets Jest run TypeScript tests.
@vscode/test-cli: CLI for running VS Code extension tests.
@vscode/test-electron: Framework for testing extensions in the VS Code Electron environment.
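The encrypted local API-key storage listed under Features could look like the following minimal sketch using Node's built-in crypto module with AES-256-GCM. This is an illustrative assumption about the approach, not the extension's actual implementation; the function names and the packed iv + tag + ciphertext layout are choices made for this example:

```typescript
import * as crypto from "crypto";

// Hypothetical sketch of encrypting an API key at rest (not the real code).
function encryptKey(apiKey: string, secret: Buffer): string {
  const iv = crypto.randomBytes(12); // fresh nonce per encryption
  const cipher = crypto.createCipheriv("aes-256-gcm", secret, iv);
  const ciphertext = Buffer.concat([
    cipher.update(apiKey, "utf8"),
    cipher.final(),
  ]);
  const tag = cipher.getAuthTag();
  // Pack iv + auth tag + ciphertext so decryption needs only this string.
  return Buffer.concat([iv, tag, ciphertext]).toString("base64");
}

function decryptKey(stored: string, secret: Buffer): string {
  const raw = Buffer.from(stored, "base64");
  const iv = raw.subarray(0, 12);
  const tag = raw.subarray(12, 28);
  const ciphertext = raw.subarray(28);
  const decipher = crypto.createDecipheriv("aes-256-gcm", secret, iv);
  decipher.setAuthTag(tag); // GCM verifies integrity on final()
  return Buffer.concat([
    decipher.update(ciphertext),
    decipher.final(),
  ]).toString("utf8");
}
```

The encrypted string would then be written to the user-specified storage path, and decrypted only when the extension needs to attach the key to an API request.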