Azure Service Intelli Debugger

Azure Service Intelli Debugger is an enterprise-grade governance and autonomous self-healing platform for Azure services. It currently supports Azure Logic Apps (Standard and Consumption) and Azure Data Factory (ADF) pipelines, and it is extensible to other Azure resources via a plugin-based architecture.

Prerequisites

Ensure the following components are installed before using the extension:

  1. Azure CLI (az): Required for resource discovery and authentication.
    • Install Azure CLI
  2. VS Code: Version 1.85.0 or higher.
  3. Azure Subscription: Access to Azure Logic App (Standard or Consumption), Azure Data Factory (ADF), or other supported Azure resources.
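
The tooling checks above can be scripted. The sketch below is plain POSIX shell and safe to run even when nothing is installed; it simply reports whether each required command is on the PATH:

```shell
# Report whether each prerequisite command-line tool is available.
for tool in az code; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: NOT FOUND - install it before using the extension"
  fi
done
```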

Getting Started

1. Installation

Install the extension from the VS Code Marketplace, or install the provided .vsix file via the Extensions view's "Install from VSIX..." command.

2. Launching the Dashboard

The primary interface for the extension is accessed through the VS Code Command Palette:

  1. Press Ctrl + Shift + P (Windows/Linux) or Cmd + Shift + P (macOS).
  2. Type "Azure Smart Debugger: Open Dashboard" and press Enter.

3. Azure Authentication

Upon initial launch, you must authenticate with your Azure account:

  1. Click the Connect button in the dashboard header.
  2. A notification will display a Device Authentication Code.
  3. Select Copy Code & Open Browser.
  4. Enter the code in the Microsoft authentication page and complete the sign-in process.
  5. Return to VS Code and select the desired Subscription when prompted.
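
The same sign-in can be done manually with the Azure CLI (the assumption here is that the extension drives the standard Azure device-code flow). The commands are echoed rather than executed, so the sketch is safe to run anywhere without touching your session:

```shell
# Manual equivalent of the dashboard's device-code sign-in (printed only).
echo 'az login --use-device-code              # prints a code and a sign-in URL'
echo 'az account list --output table          # list subscriptions you can use'
echo 'az account set --subscription "<SUBSCRIPTION_ID>"'
```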

Feature Overview

Observability Panel

Once connected, select a Resource Group and the target Azure resource (e.g., Logic App Standard, Logic App Consumption, or ADF Pipeline). The dashboard will retrieve recent execution failures, providing:

  • Run ID and relevant service metadata
  • Detailed Root Cause Analysis (RCA)
  • Failure trends and frequency
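
For ADF pipelines, the failure list the dashboard retrieves can be approximated from the Azure CLI. This sketch assumes the `datafactory` CLI extension; the resource names and time window are placeholders, and the guard keeps it safe to run while signed out:

```shell
# Fetch recent failed ADF pipeline runs, similar to what the
# Observability Panel displays.
if command -v az >/dev/null 2>&1 && az account show >/dev/null 2>&1; then
  az datafactory pipeline-run query-by-factory \
    --resource-group "<RESOURCE_GROUP>" \
    --factory-name "<FACTORY_NAME>" \
    --last-updated-after  "2024-01-01T00:00:00Z" \
    --last-updated-before "2024-01-02T00:00:00Z" \
    --filters operand=Status operator=Equals values=Failed \
    || echo "Query failed; replace the placeholder names first."
else
  echo "Sign in with 'az login' before querying pipeline runs."
fi
```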

AI-Driven Self-Healing

  1. Select the failed runs requiring remediation.
  2. Select Run Self-Healing.
  3. The system will analyze the failure and generate a proposed fix or updated resource definition.
  4. Select Show Diff to review the proposed changes against the original source code.

AI Configuration

Access the Provider Settings tab to:

  • Select an AI engine (Gemini, OpenAI, or Ollama - Local AI with Zero Configuration).
  • Configure API credentials and routing strategies.

🚀 NEW: Ollama Local AI - Zero Configuration

The extension now supports fully automatic Ollama setup! When you select Ollama as your AI provider:

  • ✅ Automatic Installation: Downloads and installs Ollama if not present
  • ✅ Auto Model Download: Downloads the default AI model (llama3.2:1b)
  • ✅ Service Management: Starts and monitors the Ollama service automatically
  • ✅ Zero User Intervention: Everything happens in the background
  • ✅ 100% Privacy: All data stays on your machine; no cloud API calls

First-time setup takes 3-7 minutes (one-time only). Subsequent uses are instant!

For detailed information, see OLLAMA_AUTO_SETUP.md
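
You can verify the managed Ollama install yourself with the standard `ollama` CLI. The check below is safe to run even before anything is installed:

```shell
# Check the local Ollama install the extension manages.
if command -v ollama >/dev/null 2>&1; then
  ollama list 2>/dev/null \
    || echo "Ollama is installed but its service is not running"
else
  echo "Ollama is not installed yet; the extension will set it up on first use"
fi
```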

Pull Request Integration

To deploy a verified fix:

  1. Enter the target GitHub Repository (owner/repo).
  2. Select Create Pull Request.
  3. The extension will create a new branch, commit the updated resource definition, and initialize a Pull Request for peer review.
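
The branch-and-commit flow the extension automates can be sketched with plain git (the branch and file names here are hypothetical). The push and PR steps are printed rather than executed, so the sketch runs without a remote or GitHub access:

```shell
# Create a branch, commit an updated resource definition, and show the
# remaining push/PR steps (printed only).
set -e
demo=$(mktemp -d)
git init -q "$demo" && cd "$demo"
git checkout -qb fix/self-healing-proposal
printf '{ "definition": "updated by self-healing" }\n' > workflow.json
git add workflow.json
git -c user.email=bot@example.com -c user.name="Self-Healing Bot" \
    commit -qm "Apply AI-proposed fix to resource definition"
echo 'Next: git push -u origin fix/self-healing-proposal'
echo 'Next: gh pr create --title "Self-healing fix" --fill'
```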

Configuration Settings

Extension behavior can be customized via VS Code Settings (Ctrl + ,):

Setting                          Default          Description
smartLogicApp.googleApiKey       ""               API key for Google Gemini.
smartLogicApp.openaiApiKey       ""               API key for OpenAI.
smartLogicApp.defaultProvider    "gemini"         The default AI engine (gemini, openai, or ollama).
smartLogicApp.routingStrategy    "explicit"       AI model routing behavior.
smartLogicApp.enableFallback     true             Enable automatic fallback to other providers.
smartLogicApp.ollamaModel        "llama3.2:1b"    Ollama model to use (auto-downloaded).
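
Taken together, a typical settings.json entry (user or workspace scope) might look like the following; the values shown are illustrative:

```json
{
  "smartLogicApp.defaultProvider": "ollama",
  "smartLogicApp.ollamaModel": "llama3.2:1b",
  "smartLogicApp.routingStrategy": "explicit",
  "smartLogicApp.enableFallback": true,
  "smartLogicApp.googleApiKey": "",
  "smartLogicApp.openaiApiKey": ""
}
```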

Troubleshooting

  • Azure CLI Command Not Found: Verify that az is correctly installed and present in your system PATH environment variable.
  • Session Expiration: If authentication fails, use the Clear Session option in the connection menu to reset the local authentication cache.
  • Resource Visibility: Ensure your Azure account has the necessary permissions (Reader or Contributor) on the target subscription.
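
For the first item, a quick diagnosis (plain shell, safe anywhere) shows whether and where az resolves:

```shell
# Diagnose "az: command not found".
if command -v az >/dev/null 2>&1; then
  echo "az resolves to: $(command -v az)"
else
  echo "az is not on PATH; PATH currently contains:"
  printf '%s\n' "$PATH" | tr ':' '\n'
fi
```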

Developed with ❤️ for the Azure Community by Soubhik and AI Agent.
