Azure Data Factory Pipeline Debugger

Shas Vaddi



Step-through debugging for ADF / Synapse pipelines without leaving VS Code.

Load pipeline JSON → set breakpoints on activities → dry-run with sample data → inspect intermediate outputs at each stage.
Highlights data type mismatches, missing linked service configs, and expression errors before you deploy.


Features

| Feature | Description |
| --- | --- |
| Load & Parse | Open any ADF / Synapse pipeline JSON and instantly see its structure and dependencies |
| Set Breakpoints | Click the red dot on any activity, or press Toggle Breakpoint inside the JSON |
| Step-Through Debug | Start (F5) → Step Over (F10) → Step Into (F11) → Continue → Stop (Shift+F5) |
| Dry-Run Simulation | Every activity type is simulated — Copy, Lookup, ForEach, IfCondition, Switch, Web, Notebook, SQL, and more |
| Intermediate Outputs | Click any activity to inspect its output JSON after execution |
| Expression Evaluator | Full ADF expression language support — @pipeline(), @activity(), @concat(), @if(), 60+ functions |
| Diagnostics | Linked service validation, dataset existence checks, schema compatibility, expression errors, policy warnings |
| Variable Inspector | Watch pipeline variables and parameters change in real time |
| Sample Data | Feed sample JSON into any activity to control simulation output |
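
As an illustration of the expression support described above, a SetVariable activity could combine nested functions like this (the activity, variable, and parameter names here are invented for the example):

```json
{
  "name": "BuildOutputPath",
  "type": "SetVariable",
  "typeProperties": {
    "variableName": "outputPath",
    "value": "@concat('output/', pipeline().parameters.env, '/', if(equals(pipeline().parameters.env, 'dev'), 'debug', 'release'))"
  }
}
```

During a dry-run, the evaluator resolves the @concat, pipeline(), if(), and equals() calls against the current parameter values, so you can verify the resulting path before deploying.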

Getting Started

  1. Install the extension (VSIX or Marketplace)
  2. Open a folder containing your ADF / Synapse pipeline JSON files
  3. Right-click a pipeline JSON → ADF Debugger: Load Pipeline
  4. The debug panel opens with your activities listed
  5. Click activities to set breakpoints → press ▶ Start

Commands

| Command | Keybinding | Description |
| --- | --- | --- |
| ADF Debugger: Load Pipeline | — | Load a pipeline JSON into the debugger |
| ADF Debugger: Start / Resume | F5 | Start or continue debug execution |
| ADF Debugger: Step Over | F10 | Execute the next activity |
| ADF Debugger: Step Into | F11 | Step into container activities |
| ADF Debugger: Continue | F5 | Run until the next breakpoint or the end |
| ADF Debugger: Stop | Shift+F5 | Stop the current debug session |
| ADF Debugger: Toggle Breakpoint | — | Set/remove a breakpoint on the activity at the cursor |
| ADF Debugger: Validate Pipeline | — | Run full diagnostics without debugging |
| ADF Debugger: Set Sample Data | — | Provide sample output JSON for an activity |
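
As an example of what Set Sample Data might take, a Lookup activity's real output wraps its result in a firstRow object, so a plausible sample file could look like this (the exact file shape the extension expects may differ; the column names are invented):

```json
{
  "firstRow": {
    "id": 1,
    "name": "contoso"
  }
}
```

Downstream expressions such as @activity('MyLookup').output.firstRow.name would then resolve against this sample during the dry-run.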

Diagnostic Categories

The debugger checks for:

  • Expression errors — invalid syntax, unknown functions, wrong argument counts
  • Type mismatches — incompatible column types between Copy source and sink
  • Missing linked services — referenced services not found in workspace
  • Missing datasets — referenced datasets not found in workspace
  • Missing dependencies — dependsOn references to non-existent activities
  • Circular dependencies — cycles in the activity dependency graph
  • Unreachable activities — activities that can never execute due to graph structure
  • Schema mismatches — source/sink column type incompatibilities in Copy activities
  • Policy warnings — excessive retry counts or invalid timeout formats
  • Variable/parameter errors — references to undefined variables or parameters
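
For instance, an activity like the following (all names invented for the example) should surface two of the diagnostics above — a missing-dependency error if no activity named StageFiles exists in the pipeline, and an expression error because @conct is not a valid function (the exact diagnostic wording may differ):

```json
{
  "name": "NotifyOnFailure",
  "type": "WebActivity",
  "dependsOn": [
    { "activity": "StageFiles", "dependencyConditions": ["Failed"] }
  ],
  "typeProperties": {
    "url": "@conct('https://hooks.example.com/', pipeline().RunId)",
    "method": "POST"
  }
}
```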

Pipeline JSON Structure

The debugger expects standard ADF / Synapse pipeline JSON:

```json
{
  "name": "MyPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyData",
        "type": "Copy",
        "dependsOn": [],
        "typeProperties": {
          "source": { "type": "AzureSqlSource" },
          "sink": { "type": "AzureBlobSink" }
        },
        "inputs": [{ "referenceName": "SqlDataset", "type": "DatasetReference" }],
        "outputs": [{ "referenceName": "BlobDataset", "type": "DatasetReference" }]
      },
      {
        "name": "LogResult",
        "type": "WebActivity",
        "dependsOn": [
          { "activity": "CopyData", "dependencyConditions": ["Succeeded"] }
        ],
        "typeProperties": {
          "url": "@concat('https://api.example.com/log/', pipeline().RunId)",
          "method": "POST"
        }
      }
    ],
    "parameters": {
      "env": { "type": "String", "defaultValue": "dev" }
    },
    "variables": {
      "counter": { "type": "Integer", "defaultValue": 0 }
    }
  }
}
```

Settings

| Setting | Default | Description |
| --- | --- | --- |
| adfDebugger.linkedServiceFolder | linkedService | Folder name for linked service JSON files |
| adfDebugger.datasetFolder | dataset | Folder name for dataset JSON files |
| adfDebugger.sampleDataFolder | .adf-debug/samples | Folder for sample data files |
| adfDebugger.maxExpressionDepth | 20 | Max recursion depth for expression evaluation |
| adfDebugger.showWarnings | true | Show warnings in the Problems panel |
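
These can be set in your user or workspace settings.json; the values below simply restate the defaults from the table:

```json
{
  "adfDebugger.linkedServiceFolder": "linkedService",
  "adfDebugger.datasetFolder": "dataset",
  "adfDebugger.sampleDataFolder": ".adf-debug/samples",
  "adfDebugger.maxExpressionDepth": 20,
  "adfDebugger.showWarnings": true
}
```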

Supported Activity Types

Copy, Lookup, GetMetadata, Filter, ForEach, IfCondition, Switch, Until, Wait, SetVariable, AppendVariable, WebActivity, WebHook, ExecutePipeline, Notebook, SparkJob, SqlServerStoredProcedure, Script, Delete, Validation, Fail, Custom, ExecuteDataFlow, ExecuteSSISPackage.

License

MIT
