Northstar Coding Assistant

Sugataai

Agentic coding assistant powered by SUGATA AI

Northstar Coding Agent

Agentic AI coding assistant for VS Code — powered by Sugata AI.

Run local or cloud LLMs. Spawn parallel agents. Browse the web. Write, test, and ship code — autonomously.



Overview

Northstar is a multi-agent coding assistant that lives inside VS Code. Unlike single-turn AI assistants, Northstar can plan, act, and self-correct — executing sequences of real tool calls (file edits, terminal commands, browser searches, git operations) until a goal is complete.

It works with any OpenAI-compatible endpoint: local models via Ollama or llama.cpp, OpenRouter's free-tier models, or paid providers like OpenAI and Anthropic. No mandatory subscription. No data sent anywhere you haven't configured.
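In practice, "OpenAI-compatible" means the provider accepts the standard chat-completions payload at `/v1/chat/completions`. A minimal sketch of such a request (the model name and Ollama URL below are illustrative placeholders, not Northstar internals):

```typescript
// Sketch: the standard chat-completions payload that any OpenAI-compatible
// server accepts. Model name and base URL are placeholders.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

const body = buildChatRequest("qwen2.5-coder:7b", [
  { role: "user", content: "Write a hello-world in Go." },
]);

// POST it to any compatible base URL, e.g. a local Ollama server:
// await fetch("http://localhost:11434/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });
```

Because every provider speaks this same shape, switching between local and cloud backends is just a matter of changing the base URL and key.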


Features

Multi-Agent Task Execution

Spawn multiple AI sub-agents in parallel, each with its own goal and toolset. Northstar coordinates them, merges results, and handles conflicts — so long-running tasks don't block your session.

Universal LLM Support

Connect to any provider with a single URL and API key:

  • Local — Ollama, llama.cpp, LM Studio, or any OpenAI-compatible server
  • Cloud — OpenAI, Anthropic, OpenRouter (including free-tier models like deepseek/deepseek-chat-v3-0324:free)
  • Automatic model detection — Northstar probes your endpoint and selects the best available model
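Endpoint probing of this kind typically works by listing models via the standard `GET /v1/models` route and choosing from a preference order. A sketch of that selection step (the preference ranking here is illustrative, not Northstar's actual logic):

```typescript
// Illustrative model selection: given the model IDs an endpoint reports,
// pick the first match from a preference list, else fall back to whatever
// the endpoint offers first. The preference order is an assumption.
function pickModel(available: string[], preferred: string[]): string | undefined {
  for (const p of preferred) {
    const hit = available.find((m) => m === p || m.startsWith(p + ":"));
    if (hit) return hit;
  }
  return available[0]; // fallback: first model the endpoint lists
}

const chosen = pickModel(
  ["llama3:8b", "qwen2.5-coder:7b"],
  ["qwen2.5-coder"],
);
```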

Full Tool Suite

Northstar agents act on your codebase — not just talk about it:

| Tool            | What it does                                                          |
|-----------------|-----------------------------------------------------------------------|
| File operations | Read, write, create, move files across your workspace                 |
| Terminal        | Run shell commands, scripts, and build pipelines                      |
| Browser         | Full Chromium automation for research, scraping, and testing          |
| Git             | Stage, commit, diff, branch — with guardrails against destructive ops |
| Code search     | Semantic and lexical search across your entire project                |
| VS Code API     | Open files, show diffs, create diagnostics natively                   |

Workstream Live View

Watch every tool call in real time as agents execute. See what's being read, written, searched, or run — with a full expandable activity log per session.

RAG — Workspace Intelligence

Northstar indexes your project with LanceDB (local vector database) and BM25 for hybrid retrieval. Agents automatically pull relevant context from your codebase before responding — no manual file pasting required.
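One common way to merge a vector ranking with a BM25 ranking is reciprocal rank fusion; whether Northstar uses RRF specifically is an assumption here, but it illustrates how hybrid retrieval combines the two signals:

```typescript
// Reciprocal rank fusion (RRF): merge two rankings by summing 1/(k + rank)
// per document. Used here to illustrate hybrid vector + BM25 retrieval;
// Northstar's actual fusion method is not documented.
function rrfMerge(vectorRank: string[], bm25Rank: string[], k = 60): string[] {
  const score = new Map<string, number>();
  for (const ranking of [vectorRank, bm25Rank]) {
    ranking.forEach((doc, i) => {
      score.set(doc, (score.get(doc) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...score.entries()].sort((a, b) => b[1] - a[1]).map(([d]) => d);
}

// A document ranked by both retrievers outranks one ranked by only one:
const merged = rrfMerge(["a", "b"], ["b", "c"]);
```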

Persistent Conversation Memory

Conversations are saved locally and restored on re-open. Switch between past sessions without losing context. Northstar links session history to the workspace it was opened in.

Thinking Mode

Toggle deep reasoning for complex architectural problems, refactoring decisions, or multi-step debugging. The agent narrates its reasoning before acting, making it easier to course-correct.

Built-in Guardrails

Northstar ships with a suite of safety checks that run during autonomous execution:

  • Loop protector — detects and breaks infinite reasoning cycles
  • Git protector — blocks force-pushes and destructive branch operations
  • Path guard — restricts file access to your workspace boundary
  • Command guard — audits shell commands before execution
  • Schema validator — ensures tool outputs are well-formed before injection into context
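A command guard of the kind listed above can be as simple as a deny-list audit before execution. The patterns below are illustrative examples, not Northstar's actual rule set:

```typescript
// Illustrative command guard: audit a shell command against a deny-list
// before it runs. These patterns are examples only.
const DENIED: RegExp[] = [
  /\brm\s+-rf\s+\/(?:\s|$)/,    // recursive delete of the filesystem root
  /\bgit\s+push\s+.*--force\b/, // force-push (also caught by the git protector)
  /\bmkfs\b/,                   // filesystem format
];

// Returns true when the command is allowed to run.
function auditCommand(cmd: string): boolean {
  return !DENIED.some((re) => re.test(cmd));
}
```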

Getting Started

1. Install the extension

Search for Northstar Coding Agent in the VS Code Extensions panel, or install directly from the Visual Studio Marketplace.

2. Open the chat panel

Click the ✦ icon in the Activity Bar, or run Open Northstar Chat from the Command Palette (Ctrl+Shift+P / Cmd+Shift+P).

3. Configure your LLM provider

Click the Settings icon inside the Northstar panel. Set your provider type, base URL, and API key.

Quick start with a free model (no API key needed):

Provider : Local (OpenAI-Compatible)
URL      : https://openrouter.ai/api/v1
Model    : deepseek/deepseek-chat-v3-0324:free
API Key  : (your free OpenRouter key)

Quick start with a local model:

Provider : Ollama
URL      : http://localhost:11434/v1
Model    : qwen2.5-coder:7b   (or any model you have pulled)
API Key  : (leave blank)

Configuration Reference

All settings are available under Settings → Extensions → Northstar AI.

| Setting                  | Default                   | Description                                                |
|--------------------------|---------------------------|------------------------------------------------------------|
| northstar.llmProvider    | Local (OpenAI-Compatible) | Provider type: Local, OpenAI, Anthropic, or Ollama         |
| northstar.llmUrl         | http://localhost:11434/v1 | Base URL for the LLM API endpoint                          |
| northstar.llmApiKey      | (empty)                   | API key for the provider (not needed for local)            |
| northstar.llmModel       | (auto-detected)           | Model name override — leave blank for auto-detection       |
| northstar.embeddingModel | (auto-detected)           | Dedicated embedding model for RAG (e.g. mxbai-embed-large) |
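These settings can also be set directly in your `settings.json`. A sketch mirroring the local quick-start above (values are the documented defaults plus an example model name):

```json
{
  "northstar.llmProvider": "Ollama",
  "northstar.llmUrl": "http://localhost:11434/v1",
  "northstar.llmApiKey": "",
  "northstar.llmModel": "qwen2.5-coder:7b",
  "northstar.embeddingModel": "mxbai-embed-large"
}
```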

Commands

| Command               | Description                    |
|-----------------------|--------------------------------|
| Northstar: Open Chat  | Open the Northstar chat panel  |
| Northstar: Clear Chat | Clear the current conversation |

Requirements

  • VS Code 1.109.0 or later
  • An LLM endpoint — local (Ollama / llama.cpp) or cloud (OpenAI, Anthropic, OpenRouter)
  • No other dependencies required — everything is bundled

Privacy

Northstar sends your messages only to the LLM provider you configure. No telemetry, no analytics, no data is sent to Sugata AI servers. Your codebase stays on your machine.


About Sugata AI

Northstar Coding Agent is built by Sugata AI — an autonomous enterprise AI platform. Our mission is to replace manual workflows with intelligent agentic systems that work alongside your team.


Marketplace · Issues · Sugata AI
