# Inference VS Code Extension

Official VS Code extension for the Inference programming language.
## Features

### Syntax Highlighting

Full syntax highlighting support for Inference language constructs:

- Keywords: `fn`, `struct`, `enum`, `type`, `const`, `let`, `pub`, `mut`, `spec`, `external`
- Control Flow: `if`, `else`, `loop`, `break`, `return`, `assert`
- Non-deterministic Constructs: `forall`, `exists`, `assume`, `unique`, `@` (uzumaki)
- Primitive Types: `i8`, `i16`, `i32`, `i64`, `u8`, `u16`, `u32`, `u64`, `bool`
- Literals: strings, numbers (decimal, hex, binary, octal), booleans
- Comments: line (`//`), documentation (`///`), and block (`/* */`)
### Language Configuration

- Auto-closing brackets: `{}`, `[]`, `()`, `""`, `''`
- Comment toggling with `Ctrl+/` (line) and `Shift+Alt+A` (block)
- Bracket matching and highlighting
- Code folding with `// #region` and `// #endregion` markers
- Smart indentation for blocks
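These editor behaviors are typically declared in a `language-configuration.json`. A minimal sketch of what such a file might look like for the features above (illustrative, not the extension's shipped configuration):

```json
{
  "comments": {
    "lineComment": "//",
    "blockComment": ["/*", "*/"]
  },
  "brackets": [["{", "}"], ["[", "]"], ["(", ")"]],
  "autoClosingPairs": [
    { "open": "{", "close": "}" },
    { "open": "[", "close": "]" },
    { "open": "(", "close": ")" },
    { "open": "\"", "close": "\"" },
    { "open": "'", "close": "'" }
  ],
  "folding": {
    "markers": {
      "start": "^\\s*//\\s*#region",
      "end": "^\\s*//\\s*#endregion"
    }
  }
}
```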
### File Association

- Automatically activates for `.inf` files
- Custom file icon for Inference source files
## Toolchain Management

The extension provides comprehensive toolchain management through integration with the `infs` CLI. All operations are fully automated and require no manual configuration.
### Automatic Detection

On activation, the extension automatically detects your toolchain using the following priority:

1. Custom path from the `inference.path` setting
2. Managed installation at `INFERENCE_HOME/bin/infs` (respects the `INFERENCE_HOME` environment variable)
3. System `PATH`

The detection result is displayed in the Configuration sidebar and logged to the Output channel.
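The priority above can be sketched as a small resolver. The names and signature below are illustrative, not the extension's actual internals:

```typescript
type DetectionSource = "settings" | "managed" | "path";

interface Detection {
  binary: string;
  source: DetectionSource;
}

// Hypothetical resolver mirroring the documented priority order.
// Each argument is the candidate found (or null / "" when absent).
function resolveToolchain(
  settingPath: string,        // value of the inference.path setting
  managedPath: string | null, // INFERENCE_HOME/bin/infs, if present
  systemPath: string | null,  // infs found on the system PATH, if any
): Detection | null {
  if (settingPath !== "") return { binary: settingPath, source: "settings" };
  if (managedPath) return { binary: managedPath, source: "managed" };
  if (systemPath) return { binary: systemPath, source: "path" };
  return null; // nothing found: the extension may prompt to install
}
```

When several candidates exist, the managed installation wins over anything on `PATH`, and an explicit `inference.path` setting wins over both.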
### Configuration View

A dedicated Inference icon appears in the VS Code activity bar. Click it to open the Configuration view with real-time toolchain information.

Toolchain Group:

- Binary path and detection source (settings/managed/path)
- Installed version number
- `INFERENCE_HOME` directory location (default or custom)
- Detected platform (e.g., `linux-x64`, `macos-arm64`, `windows-x64`)
- Health status with diagnostic results
Settings Group:

- `inference.path` - Custom binary path (click to configure)
- `inference.autoInstall` - Auto-install prompt behavior
- `inference.checkForUpdates` - Automatic update checking
Interactive Actions:

- Click any path item to copy its value to clipboard
- Right-click path items to reveal in file explorer
- Click status to run doctor diagnostics
- Use the refresh button in the title bar to reload
The view automatically refreshes when settings change or after install/update operations.
### Terminal Integration

The extension automatically prepends `INFERENCE_HOME/bin` to `PATH` for all VS Code integrated terminals using the `EnvironmentVariableCollection` API:

- New terminals immediately have `infs` and `infc` available
- Existing terminals show a relaunch indicator when the toolchain changes
- No VS Code restart is required after installation or updates
- Works across all supported platforms
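A rough sketch of the `PATH` handling, assuming a helper that computes the prepended value (the real extension delegates this to VS Code's `EnvironmentVariableCollection`, so the helper name and shape here are illustrative):

```typescript
import { delimiter } from "path";

// Compute a PATH value with binDir first, dropping any duplicate entry.
function prependToPath(currentPath: string, binDir: string): string {
  const parts = currentPath.split(delimiter).filter((p) => p && p !== binDir);
  return [binDir, ...parts].join(delimiter);
}

// In extension code this would be applied once on activation, roughly:
//   context.environmentVariableCollection.prepend("PATH", binDir + delimiter);
// VS Code then injects the change into every integrated terminal it spawns,
// which is why no editor restart is needed.
```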
### Status Bar

The bottom-left status bar shows real-time toolchain health. Click the status bar item to run full diagnostics via `infs doctor`.
## Available Commands

Open the Command Palette (`Ctrl+Shift+P` / `Cmd+Shift+P`):

- `Inference: Install Toolchain` - Download and install the latest `infs` release for your platform
- `Inference: Update Toolchain` - Check for updates and install the latest version
- `Inference: Select Toolchain Version` - Browse and switch between available versions
- `Inference: Run Doctor` - Execute comprehensive health diagnostics
- `Inference: Refresh Configuration` - Reload the Configuration sidebar view
- `Inference: Show Output` - Open the Inference output log channel
- `Inference: Reset PATH Fallback Preference` - Clear the saved PATH fallback acceptance

A guided setup walkthrough is available via `Get Started: Open Walkthrough...` > `Get Started with Inference`.
## Installation

### From VS Code Marketplace

1. Open VS Code
2. Press `Ctrl+P` to open Quick Open
3. Type `ext install inference-lang.inference`
4. Press Enter

### From VSIX

1. Download the `.vsix` file from Releases
2. In VS Code, press `Ctrl+Shift+P`
3. Type "Install from VSIX" and select the command
4. Choose the downloaded `.vsix` file
## Configuration

### Settings

- `inference.path` (string, default: `""`) - Custom path to the `infs` binary. Leave empty for automatic detection. Scope: machine (not synced across devices).
- `inference.autoInstall` (boolean, default: `true`) - Prompt to install the toolchain if it is not found on activation.
- `inference.checkForUpdates` (boolean, default: `true`) - Automatically check for toolchain updates on activation.
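For reference, a hypothetical `settings.json` fragment using these options (the values are examples only):

```jsonc
{
  // Point the extension at a custom binary instead of auto-detection
  "inference.path": "/opt/inference/bin/infs",
  // Keep the install prompt, but skip automatic update checks
  "inference.autoInstall": true,
  "inference.checkForUpdates": false
}
```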
### Environment Variables

- `INFERENCE_HOME` - Override the default toolchain directory (default: `~/.inference` on Unix, `%LOCALAPPDATA%\Inference` on Windows)
- `INFS_DIST_SERVER` - Override the distribution server URL (for development/testing)
## Supported Platforms

Automatic toolchain installation is supported on:

- Linux: x86_64 (glibc)
- macOS: ARM64 (Apple Silicon)
- Windows: x86_64

Other platforms can use the extension for syntax highlighting but must install the toolchain manually.
## Example

```
/// Computes factorial using non-deterministic verification
pub fn factorial(n: i32) -> i32 {
    let mut result: i32 = 1;
    let mut i: i32 = 1;
    loop {
        if i > n {
            break;
        }
        result = result * i;
        i = i + 1;
    }

    // Verify the result using a forall block
    forall {
        const witness: i32 = @;
        assume {
            const valid: bool = witness >= 0;
        }
    }

    return result;
}
```
## What is Inference?

Inference is a programming language designed for mission-critical application development. It includes first-class support for formal verification via translation to Rocq (Coq) and targets WebAssembly as its primary runtime platform.

Key features:

- Formal Verification: Built-in support for proofs and specifications
- Non-deterministic Programming: `forall`, `exists`, `assume`, `unique` constructs
- WebAssembly Target: Compiles to efficient WASM
- Rocq Translation: Generate Coq proofs from your code
## Troubleshooting

### Toolchain not detected

- Check the Output panel (View > Output > select "Inference")
- Run `Inference: Run Doctor` to see detailed diagnostics
- Verify the `inference.path` setting if using a custom location
- Try `Inference: Install Toolchain` to install automatically

### Terminal commands not found

- Close all open terminals and open a new one (Terminal > New Terminal)
- The extension automatically adds `INFERENCE_HOME/bin` to `PATH`
- For external terminals, add the path to your shell profile manually
## Privacy

This extension does not collect telemetry, usage data, or any personal information. All toolchain operations communicate only with `github.com/Inferara/inference/releases` and `inference-lang.org/releases.json`.
## Contributing

Contributions are welcome! Please see the main repository for contribution guidelines.
## License

GPL-3.0 - See LICENSE for details.