Microsoft Fabric Lakehouse Developer


Shas Vaddi


Author Fabric lakehouses, data pipelines, semantic models, and KQL querysets from VS Code. Preview OneLake data, manage shortcuts, and deploy across Fabric workspaces with CI/CD.

Features

🏗 Workspace Explorer

  • Browse all Fabric workspaces from the Activity Bar
  • Hierarchical view: Workspaces → Lakehouses / Pipelines / Semantic Models / KQL Databases / Notebooks
  • Drill into lakehouses to see tables and files

📊 Lakehouse Management

  • Create Lakehouses directly from VS Code
  • Preview Table Data with column types, row counts, and scrollable grids
  • Browse OneLake Files with size and last-modified metadata
  • Create OneLake Shortcuts to OneLake, ADLS Gen2, Amazon S3, or Google Cloud Storage

🔄 Data Pipelines

  • View pipeline definitions with activity flow visualization
  • Create pipelines from templates (Empty, Copy Activity, Notebook)
  • Run pipelines with status tracking
  • Dependency graph showing activity relationships

📈 Semantic Models

  • Browse tables, columns, and data types
  • View DAX measures with expressions
  • Inspect relationships and cardinalities

🔍 KQL Query Editor

  • Execute KQL queries against Real-Time Intelligence databases
  • Run query from active .kql file or editor selection
  • View results in a tabular grid with execution statistics
  • Re-run queries directly from the results panel

🚀 Deployment Manager

  • Deploy items across Fabric deployment pipeline stages
  • Select specific items to deploy (Lakehouses, Pipelines, Notebooks, Semantic Models, etc.)
  • Configure overwrite and creation options
  • Progress tracking with long-running operation polling

Getting Started

Prerequisites

  • Microsoft Fabric workspace with appropriate permissions
  • Azure AD account with access to Fabric APIs

Connect to Fabric

  1. Open the Microsoft Fabric panel in the Activity Bar
  2. Click Connect to Workspace (or use Ctrl+Shift+P → Fabric: Connect to Workspace)
  3. Sign in with your Microsoft account
  4. Your Fabric workspaces will appear in the explorer

Configuration

Setting | Description | Default
--- | --- | ---
fabricLakehouse.tenantId | Azure Tenant ID | (empty)
fabricLakehouse.defaultWorkspaceId | Auto-connect workspace ID | (empty)
fabricLakehouse.previewRowLimit | Max rows in table preview | 100
fabricLakehouse.apiEndpoint | Fabric REST API base URL | https://api.fabric.microsoft.com/v1
fabricLakehouse.oneLakeEndpoint | OneLake DFS endpoint | https://onelake.dfs.fabric.microsoft.com
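
These settings live in VS Code's settings.json. For example, to pin a tenant, auto-connect to a workspace, and raise the preview limit (the IDs below are placeholders — substitute your own):

```json
{
  "fabricLakehouse.tenantId": "00000000-0000-0000-0000-000000000000",
  "fabricLakehouse.defaultWorkspaceId": "11111111-1111-1111-1111-111111111111",
  "fabricLakehouse.previewRowLimit": 500
}
```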

Detailed Walkthrough

1. Connect to Microsoft Fabric

  1. Install the extension and reload VS Code
  2. Look for the Microsoft Fabric icon in the Activity Bar (left sidebar)
  3. Click the icon to open the Workspaces panel
  4. You'll see a placeholder: "Click to connect to Fabric..." — click it
  5. A Microsoft sign-in prompt will appear — authenticate with your Azure AD account
  6. Once signed in, all Fabric workspaces your account can access will populate the tree

Tip: To auto-connect on startup, set fabricLakehouse.defaultWorkspaceId in VS Code settings with your workspace ID.


2. Browse Workspace Items

  1. Expand any workspace node in the tree
  2. You'll see five categories:
    • 📦 Lakehouses — Delta Lake stores
    • 🔄 Data Pipelines — ETL orchestration
    • 📈 Semantic Models — Power BI datasets
    • 🔍 KQL Databases — Real-Time Intelligence
    • 📓 Notebooks — Spark notebooks
  3. Expand a category to list all items of that type
  4. Expand a Lakehouse to see its tables and a 📁 Files folder

3. Create a New Lakehouse

  1. In the Workspaces tree, right-click on a workspace node
  2. Select Fabric: Create Lakehouse
  3. Enter a name (e.g., sales-lakehouse) and press Enter
  4. Optionally add a description
  5. A progress notification will appear while the lakehouse is provisioned
  6. Once complete, click Refresh to see it in the tree

4. Preview Table Data

  1. Expand a Lakehouse node to see its tables
  2. Click the table icon (inline button) on any table, or right-click → Fabric: Preview Table Data
  3. A webview panel opens showing:
    • Column names with data types
    • A scrollable grid of rows (default: 100 rows)
    • Row count summary (e.g., "Showing 100 of 1,245,302 rows")
  4. Adjust the preview row limit in settings: fabricLakehouse.previewRowLimit
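
The row-count summary shown in the preview panel is a function of the configured limit and the table's total row count. A minimal sketch (the function name is illustrative; the default limit mirrors fabricLakehouse.previewRowLimit):

```python
def preview_summary(total_rows: int, limit: int = 100) -> str:
    """Row-count summary line like the one the preview panel shows.

    The default limit of 100 matches fabricLakehouse.previewRowLimit.
    """
    shown = min(total_rows, limit)
    return f"Showing {shown:,} of {total_rows:,} rows"

line = preview_summary(1_245_302)
```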

5. Browse OneLake Files

  1. Expand a Lakehouse node in the tree
  2. Click the 📁 Files folder (or right-click the lakehouse → Fabric: Browse OneLake Files)
  3. A webview panel opens showing:
    • File and folder names with 📁/📄 icons
    • File sizes (formatted as KB/MB/GB)
    • Last modified timestamps
  4. This uses the OneLake DFS API (ADLS Gen2 compatible) to list contents under the lakehouse Files/ path
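
A sketch of what such a listing request and the KB/MB/GB size formatting could look like. The URL follows the ADLS Gen2 "list paths" pattern with OneLake's workspace-as-filesystem convention; the extension's exact request shape may differ, and the workspace/lakehouse names are hypothetical:

```python
from urllib.parse import urlencode

ONELAKE_ENDPOINT = "https://onelake.dfs.fabric.microsoft.com"

def list_files_url(workspace: str, lakehouse: str, subdir: str = "") -> str:
    """Build an ADLS Gen2-style 'list paths' URL for a lakehouse Files/ path."""
    directory = f"{lakehouse}.Lakehouse/Files"
    if subdir:
        directory += f"/{subdir}"
    query = urlencode({"resource": "filesystem",
                       "directory": directory,
                       "recursive": "false"})
    return f"{ONELAKE_ENDPOINT}/{workspace}?{query}"

def format_size(num_bytes: float) -> str:
    """Format a byte count as B/KB/MB/GB, as the file browser panel does."""
    for unit in ("B", "KB", "MB", "GB"):
        if num_bytes < 1024:
            return f"{num_bytes:.0f} {unit}" if unit == "B" else f"{num_bytes:.1f} {unit}"
        num_bytes /= 1024
    return f"{num_bytes:.1f} TB"
```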

6. Create a OneLake Shortcut

Shortcuts let you reference data in other locations without copying it.

  1. Right-click a Lakehouse node → Fabric: Create OneLake Shortcut
  2. Pick the target type:
    • OneLake — another Fabric lakehouse
    • ADLS Gen2 — Azure Data Lake Storage
    • Amazon S3 — AWS S3 bucket
    • Google Cloud Storage — GCS bucket
  3. Enter the shortcut name (e.g., external-sales)
  4. Provide the target path (e.g., /Tables/sales)
  5. Enter target-specific details:
    • OneLake: Target workspace ID + item ID
    • ADLS Gen2: Storage account, container, connection ID
    • S3/GCS: Bucket name, connection ID
  6. The shortcut appears in the lakehouse immediately
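
For a OneLake-type target, the details collected above map onto a create-shortcut request body roughly like the following. Field names follow the public Fabric REST API (POST /v1/workspaces/{workspaceId}/items/{itemId}/shortcuts), but verify against the current spec; all IDs here are placeholders:

```python
def onelake_shortcut(name: str, path: str, target_workspace_id: str,
                     target_item_id: str, target_path: str) -> dict:
    """Assemble a create-shortcut request body for a OneLake target.

    The shape approximates the Fabric REST API; check the official
    reference before relying on it.
    """
    return {
        "name": name,    # shortcut name shown in the lakehouse, e.g. "external-sales"
        "path": path,    # where the shortcut is created, e.g. "Tables"
        "target": {
            "oneLake": {
                "workspaceId": target_workspace_id,
                "itemId": target_item_id,
                "path": target_path,   # path inside the target item, e.g. "/Tables/sales"
            }
        },
    }

body = onelake_shortcut("external-sales", "Tables",
                        "wsid-123", "item-456", "/Tables/sales")
```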

7. View a Data Pipeline

  1. Expand Data Pipelines under a workspace
  2. Click the circuit-board icon on a pipeline, or right-click → Fabric: View Data Pipeline
  3. A webview opens showing:
    • Activity cards arranged left-to-right with type badges (Copy = green, Notebook = purple, DataFlow = blue)
    • Dependency arrows between activities
    • Parameters table (if the pipeline has parameters)
    • A ▶ Run Pipeline button at the top
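
The left-to-right arrangement can be derived from each activity's dependency list with a simple topological pass: an activity's column is the length of the longest dependency chain before it, so independent activities share a column and all arrows point rightward. A sketch (the activity names and the dependency-map shape are illustrative, not the extension's internal format):

```python
from collections import deque

def layout_columns(activities: dict[str, list[str]]) -> list[list[str]]:
    """Assign each activity a column index = longest dependency chain
    leading to it, via Kahn's algorithm over the dependency graph."""
    indegree = {a: len(deps) for a, deps in activities.items()}
    dependents: dict[str, list[str]] = {a: [] for a in activities}
    for a, deps in activities.items():
        for d in deps:
            dependents[d].append(a)

    depth = {a: 0 for a in activities}
    queue = deque(a for a, n in indegree.items() if n == 0)
    while queue:
        a = queue.popleft()
        for nxt in dependents[a]:
            depth[nxt] = max(depth[nxt], depth[a] + 1)
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                queue.append(nxt)

    columns: list[list[str]] = [[] for _ in range(max(depth.values()) + 1)]
    for a, d in depth.items():
        columns[d].append(a)
    return columns

# CopyRaw has no dependencies; Transform waits on it; LoadGold waits on Transform.
cols = layout_columns({"CopyRaw": [], "Transform": ["CopyRaw"], "LoadGold": ["Transform"]})
```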

8. Create a Data Pipeline

  1. Right-click a workspace node → Fabric: Create Data Pipeline
  2. Enter a pipeline name (e.g., ingest-daily-sales)
  3. Pick a template:
    • Empty Pipeline — blank canvas
    • Copy Activity — pre-configured copy from delimited text to Parquet
    • Notebook Activity — runs a Fabric notebook
  4. The pipeline is created and visible after refreshing the tree

9. Run a Data Pipeline

  1. Right-click a pipeline node → Fabric: Run Data Pipeline
  2. Confirm the run in the modal dialog
  3. The pipeline job is submitted and you'll get a notification with the job info
  4. You can also run from inside the pipeline viewer webview using the ▶ Run Pipeline button

10. Execute a KQL Query

Option A — From the tree:

  1. Right-click a KQL Database node → Fabric: Run KQL Query
  2. The query service URI and database name are auto-populated
  3. Enter your KQL query (e.g., .show tables or MyTable | take 10)

Option B — From a .kql file:

  1. Open or create a file with a .kql extension
  2. Write your KQL query
  3. Run Ctrl+Shift+P → Fabric: Run KQL Query
  4. The extension reads the entire file (or just your selection) as the query

Option C — From any editor:

  1. Select text in any editor
  2. Run Fabric: Run KQL Query
  3. The selected text is used as the query

Results panel shows:

  • Column headers with data types
  • Tabular results grid
  • Execution statistics (time, CPU, memory)
  • A textarea + Run button to modify and re-run queries inline
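
Under the hood, a KQL query against the query service URI boils down to one REST call. A sketch of building that request, assuming the standard Azure Data Explorer REST shape (POST {cluster}/v2/rest/query with a JSON body of db + csl); the extension's actual request may differ:

```python
import json

def kql_request(database: str, query: str) -> tuple[str, str]:
    """Build the path and JSON body for a Kusto/KQL REST query.

    Follows the Azure Data Explorer REST API convention; POST this body
    to <query service URI> + path with a bearer token to execute.
    """
    body = json.dumps({"db": database, "csl": query})
    return "/v2/rest/query", body

path, body = kql_request("telemetry", "MyTable | take 10")
```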

11. View a Semantic Model

  1. Expand Semantic Models under a workspace
  2. Click the graph icon on a model, or right-click → Fabric: View Semantic Model
  3. A webview shows:
    • Tables with all columns, data types, and hidden-column indicators (🔒)
    • Measures with DAX expressions
    • Relationships showing from/to columns with cardinality (OneToMany, ManyToMany, etc.)

12. Deploy Across Workspaces

This uses Fabric deployment pipelines (Dev → Test → Prod) for CI/CD.

  1. Right-click a workspace node → Fabric: Deploy to Workspace
  2. Pick a deployment pipeline from the list (created in the Fabric portal)
  3. Select the source stage (e.g., Development)
  4. Select the target stage (e.g., Test or Production)
  5. Choose which items to deploy — lakehouses, pipelines, notebooks, semantic models, etc.
  6. Set whether to allow overwriting existing items in the target
  7. Confirm deployment in the modal dialog
  8. A progress notification tracks the deployment until complete
  9. On success, you'll see: "✅ Deployed 5 items to Production successfully."

Prerequisite: Deployment pipelines must be created in the Fabric portal first. This extension uses existing pipelines — it doesn't create them.
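
The progress tracking mentioned above relies on Fabric's long-running-operation pattern: poll the operation's status endpoint until it reaches a terminal state. A minimal sketch, where get_status stands in for the HTTP status call and the Running/Succeeded/Failed values follow the Fabric LRO convention:

```python
import time
from typing import Callable

def poll_operation(get_status: Callable[[], str],
                   interval_s: float = 2.0,
                   timeout_s: float = 600.0) -> str:
    """Poll a long-running operation until it succeeds, fails, or times out."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("Succeeded", "Failed"):
            return status
        time.sleep(interval_s)  # wait before the next status check
    raise TimeoutError("deployment did not finish in time")

# Simulated operation that succeeds on the third poll.
states = iter(["Running", "Running", "Succeeded"])
result = poll_operation(lambda: next(states), interval_s=0.01)
```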


Commands

Command | Description
--- | ---
Fabric: Connect to Workspace | Authenticate and load workspaces
Fabric: Refresh Explorer | Refresh the workspace tree
Fabric: Create Lakehouse | Create a new lakehouse in a workspace
Fabric: Preview Table Data | Preview rows from a lakehouse table
Fabric: Browse OneLake Files | Browse files in a lakehouse
Fabric: Create OneLake Shortcut | Create a shortcut to external data
Fabric: View Data Pipeline | Visualize pipeline activities
Fabric: Run Data Pipeline | Execute a data pipeline
Fabric: Create Data Pipeline | Create a new pipeline from a template
Fabric: Run KQL Query | Execute a KQL query
Fabric: View Semantic Model | Inspect a semantic model definition
Fabric: Deploy to Workspace | Deploy items across deployment stages

API Coverage

This extension uses the following Microsoft Fabric APIs:

  • Fabric REST API (api.fabric.microsoft.com/v1) — Workspaces, Items, Lakehouses, Data Pipelines, Semantic Models, KQL Databases, Deployment Pipelines, Jobs
  • OneLake DFS API (onelake.dfs.fabric.microsoft.com) — ADLS Gen2-compatible file operations for browsing, uploading, and managing lakehouse files

License

MIT License — see LICENSE.txt
