# Elixium Coder

A local-first VS Code extension for AI-assisted coding. It talks only to the
endpoints you configure: your LLM provider and your MCP servers.
## Install

Download `elixium-coder-0.1.1.vsix` from the Releases page and install it:

```sh
code --install-extension elixium-coder-0.1.1.vsix
```

Or in VS Code: Extensions → "…" menu → "Install from VSIX…".
## Build from source

```sh
git clone https://github.com/bereizy/elixium-coder.git
cd elixium-coder
pnpm install
pnpm vsix
# result: bin/elixium-coder-0.1.1.vsix
```

Requires Node 20+ and pnpm 10+.
## Providers

Five providers are supported. Pick one:

| Provider | Use for |
| --- | --- |
| OpenAI-Compatible | Ollama, LM Studio, vLLM, LocalAI, any OpenAI-compatible API |
| Anthropic | Claude Opus / Sonnet / Haiku via https://api.anthropic.com |
| OpenAI | GPT via https://api.openai.com (also covers Azure OpenAI) |
| OpenAI-Native | OpenAI Responses API (reasoning models, stateful tool calls) |
| Bedrock | Claude / Llama / Titan via AWS Bedrock |

Credentials are stored in VS Code secret storage.
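
For a local OpenAI-compatible backend such as Ollama, the configuration might look like the sketch below. The setting keys and the model name are hypothetical illustrations, not confirmed extension settings; check the extension's Settings UI for the actual names.

```jsonc
// settings.json — hypothetical keys, shown only to illustrate the shape
{
  "elixiumCoder.provider": "openai-compatible",
  "elixiumCoder.baseUrl": "http://localhost:11434/v1", // Ollama's default OpenAI-compatible endpoint
  "elixiumCoder.model": "qwen2.5-coder:7b"             // any model you have pulled locally
}
```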
## MCP

MCP servers are configured via `.mcp.json` and speak only to the endpoints
you specify. See https://modelcontextprotocol.io/ for server setup.
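
A minimal `.mcp.json` sketch, assuming Elixium Coder uses the `mcpServers` layout common to other MCP clients (the filesystem server package comes from the official MCP servers repository; the workspace path is a placeholder):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"]
    }
  }
}
```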
## Network posture

Elixium Coder's only outbound traffic goes to:

- Your configured LLM endpoint
- Your configured MCP servers

No telemetry. No sign-in. No account required.
## License

Apache License 2.0. See NOTICE for attribution.

Copyright 2026 IndirectTek.