Cencurity for VS Code (macOS)

Cencurity
Protect AI-generated code in proxy-routable AI workflows through the Cencurity security proxy. Marketplace build.

Cencurity for VS Code

Protect AI-generated code in real time. Cencurity routes LLM traffic through a security proxy that inspects streamed responses and enforces policy before code reaches developers.

Use it to connect VS Code and supported AI tools to Cencurity, manage a local proxy runtime, and keep proxy configuration aligned across your workspace.

It offers:

  • Guided connection setup for the Cencurity proxy
  • Secure API key storage for proxy setups that require one
  • Optional local proxy install, startup, and connection from a bundled local-pack
  • Status visibility directly in the VS Code status bar
  • Fast access to local protection and proxy controls

Prerequisites

  • A bundled local runtime included with the Marketplace package, or a reachable Cencurity proxy if your team provides one
  • A Cencurity API key only if your configured proxy requires one
  • A supported AI tool or workflow in VS Code that can route traffic through Cencurity

Quickstart

  1. Install the extension.
    • Install the Marketplace package for this operating system. Windows builds remain available through the main Cencurity Marketplace extension.
  2. Start protection.
    • On first startup, the extension attempts to install, start, and connect the bundled local runtime automatically.
    • You can also run Cencurity: Install Local Proxy manually to install, start, and connect in one step.
  3. If your configured proxy requires a Cencurity API key, enter it when prompted during connection.
  4. Start using your supported workflow through Cencurity.

Beyond the automatic first-run setup, you can manage the bundled local runtime manually:

  1. Run Cencurity: Install Local Proxy.
  2. If needed later, run Cencurity: Start Local Proxy to restart a previously installed runtime.

After a successful local install, auto-start is enabled for future sessions.

To verify that everything is working, open the status bar entry or run Cencurity: Show Connection Status.

What you can do with Cencurity

Connect your editor to the Cencurity proxy

The extension helps VS Code work with a Cencurity-managed proxy endpoint so you can centralize policy enforcement, inspection, and routing for AI traffic.

Run a local proxy without manual setup

When a bundled local-pack is included, the extension can install it directly from the extension package. If no bundled artifact is present, it can fall back to a trusted download source that you configure.

Keep configuration aligned across tools

The extension can help populate OpenAI-compatible base URL settings for integrated terminals and can optionally keep a .env file updated for tools that load environment variables from disk.
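Tools that consume these settings typically resolve their base URL through a precedence chain. A minimal sketch of that resolution, assuming the conventional OPENAI_BASE_URL / OPENAI_API_BASE precedence (the helper name is hypothetical; the local proxy address comes from this README, and whether a path suffix such as /v1 is needed depends on the tool):

```python
# Hypothetical helper mirroring how OpenAI-compatible tools typically
# resolve their base URL: OPENAI_BASE_URL wins, then OPENAI_API_BASE,
# then the tool's built-in default.
def resolve_base_url(env, default="https://api.openai.com/v1"):
    return env.get("OPENAI_BASE_URL") or env.get("OPENAI_API_BASE") or default

# With terminal injection active, both variables point at the local
# Cencurity proxy (http://localhost:8082 per this README).
print(resolve_base_url({"OPENAI_BASE_URL": "http://localhost:8082"}))
```

Because both the terminal environment and the optional .env file carry the same values, compatible tools resolve the same endpoint regardless of how they load configuration.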

Monitor connection health

Use the status bar and built-in commands to check whether the proxy is reachable.

Use the full potential of Cencurity

Install and run a local proxy runtime

If your team distributes bundled local-pack artifacts, developers can install, start, and connect a local Cencurity runtime directly from the extension instead of setting up a proxy process manually.

Standardize proxy configuration across projects

Use terminal environment injection or optional .env updates so compatible tools resolve the same proxy endpoint consistently.

Make onboarding easier for teams

The extension can automatically install, start, and connect the bundled local runtime on first startup, then keep that runtime available for future sessions with auto-start enabled.

Works with

Cencurity for VS Code is designed for proxy-routable AI development workflows, including:

  • OpenAI-compatible AI coding tools that support custom base URLs
  • Claude- and Gemini-based tools or workflows when they support custom endpoints or proxy routing
  • CLI workflows that read OPENAI_BASE_URL or OPENAI_API_BASE
  • Local or remote Cencurity proxy deployments
  • Bundled platform-specific local runtimes for Windows, Linux, and macOS

IDE-native AI features that do not expose custom endpoint or proxy settings may require separate integration.

How it works

Typical flow:

IDE or AI agent
  → Cencurity proxy
    → LLM provider

The extension itself is a connector and control surface. Proxy execution happens outside the editor, either through your existing Cencurity deployment or an optional local runtime.

Local proxy runtime

The optional local proxy mode is designed to reduce first-run friction.

  • Preferred installation source: a bundled local-pack included in the VSIX
  • Fallback installation source: cencurity.localProxy.downloadBaseUrl
  • Integrity check: cencurity.localProxy.sha256
  • Optional auto-start on VS Code launch

Recommended artifact format:

  • cencurity-localpack_<platform>_<arch>.zip

The local-pack can include the proxy, engine runtime, and rules needed for local execution.
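A minimal sketch of how an installer might derive the artifact name and apply the integrity check; the helper names are hypothetical, and only the naming scheme and the cencurity.localProxy.sha256 setting come from this README:

```python
import hashlib

# Hypothetical helper following the recommended naming scheme
# cencurity-localpack_<platform>_<arch>.zip.
def artifact_name(system: str, arch: str) -> str:
    return f"cencurity-localpack_{system}_{arch}.zip"

# Sketch of the integrity check implied by cencurity.localProxy.sha256:
# compare the downloaded artifact's digest against the configured value.
def verify_sha256(data: bytes, expected_hex: str) -> bool:
    return hashlib.sha256(data).hexdigest() == expected_hex.lower()

print(artifact_name("darwin", "arm64"))  # cencurity-localpack_darwin_arm64.zip
```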

Configure the extension

Open the extension settings in VS Code to manage Cencurity behavior.

To open extension settings:

  1. Open the Extensions view in VS Code.
  2. Select Cencurity for VS Code.
  3. Click the gear icon and choose Extension Settings.

Common settings include:

  • cencurity.proxyUrl
  • cencurity.healthCheckIntervalSeconds
  • cencurity.autoConfigureTerminalEnv
  • cencurity.autoConfigureDotenv
  • cencurity.dotenvFilePath
  • cencurity.localProxy.autoStart
  • cencurity.localProxy.promptToInstallOnStartup
  • cencurity.localProxy.port
  • cencurity.localProxy.args
  • cencurity.localProxy.downloadBaseUrl
  • cencurity.localProxy.artifactTemplate
  • cencurity.localProxy.sha256
  • cencurity.notifyOnConnectionFailure
  • cencurity.showOneTimeInstructions

Configuration tips

  • Use cencurity.proxyUrl to point the extension at a different Cencurity proxy when needed.
  • Use cencurity.localProxy.autoStart for a smoother local developer experience.
  • Use cencurity.autoConfigureTerminalEnv to keep integrated terminals pointed at Cencurity.
  • Use cencurity.autoConfigureDotenv when your tooling loads environment variables from a .env file.
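Pulling these tips together, a hypothetical settings.json for a local-proxy setup might look like this (the keys come from this README; the values are illustrative placeholders, with the port matching the bundled runtime address mentioned below):

```jsonc
{
  // Point the extension at the bundled local runtime.
  "cencurity.proxyUrl": "http://localhost:8082",
  "cencurity.localProxy.autoStart": true,
  "cencurity.localProxy.port": 8082,
  // Keep integrated terminals and a .env file aligned with the proxy.
  "cencurity.autoConfigureTerminalEnv": true,
  "cencurity.autoConfigureDotenv": false,
  "cencurity.healthCheckIntervalSeconds": 30
}
```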

Commands

Run these commands from the Command Palette:

  • Cencurity: Connect
  • Cencurity: Auto Configure Proxy
  • Cencurity: Install Local Proxy
  • Cencurity: Start Local Proxy
  • Cencurity: Stop Local Proxy
  • Cencurity: Set Proxy URL
  • Cencurity: Show Connection Status

Security and storage

If your configured proxy requires a Cencurity API key, it is stored using VS Code Secret Storage. It is not written to normal workspace settings by default. The bundled local runtime at http://localhost:8082 does not require a Cencurity API key.

Some supported tools still require manual endpoint or base URL configuration. Common variables for OpenAI-compatible workflows include OPENAI_BASE_URL and OPENAI_API_BASE.
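For tools that load configuration from disk, a .env file can carry the same endpoint. An illustrative fragment (the address comes from this README; whether a path suffix is required depends on the tool):

```
OPENAI_BASE_URL=http://localhost:8082
OPENAI_API_BASE=http://localhost:8082
```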

Support

If you need help, want to report an issue, or want to track product updates:

  • Repository: https://github.com/cencurity/ide-plugin
  • Issues: https://github.com/cencurity/ide-plugin/issues

Development

```shell
npm install
npm run compile
```

Then press F5 from the extension project to launch an Extension Development Host.
