Microsoft Fabric Lakehouse Developer
Author Fabric lakehouses, data pipelines, semantic models, and KQL querysets from VS Code. Preview OneLake data, manage shortcuts, and deploy across Fabric workspaces with CI/CD.
Features
🏗 Workspace Explorer
- Browse all Fabric workspaces from the Activity Bar
- Hierarchical view: Workspaces → Lakehouses / Pipelines / Semantic Models / KQL Databases / Notebooks
- Drill into lakehouses to see tables and files
📊 Lakehouse Management
- Create Lakehouses directly from VS Code
- Preview Table Data with column types, row counts, and scrollable grids
- Browse OneLake Files with size and last-modified metadata
- Create OneLake Shortcuts to OneLake, ADLS Gen2, Amazon S3, or Google Cloud Storage
🔄 Data Pipelines
- View pipeline definitions with activity flow visualization
- Create pipelines from templates (Empty, Copy Activity, Notebook)
- Run pipelines with status tracking
- Dependency graph showing activity relationships
📈 Semantic Models
- Browse tables, columns, and data types
- View DAX measures with expressions
- Inspect relationships and cardinalities
🔍 KQL Query Editor
- Execute KQL queries against Real-Time Intelligence databases
- Run query from the active `.kql` file or editor selection
- View results in a tabular grid with execution statistics
- Re-run queries directly from the results panel
🚀 Deployment Manager
- Deploy items across Fabric deployment pipeline stages
- Select specific items to deploy (Lakehouses, Pipelines, Notebooks, Semantic Models, etc.)
- Configure overwrite and creation options
- Progress tracking with long-running operation polling
Getting Started
Prerequisites
- Microsoft Fabric workspace with appropriate permissions
- Azure AD account with access to Fabric APIs
Connect to Fabric
- Open the Microsoft Fabric panel in the Activity Bar
- Click Connect to Workspace (or use `Ctrl+Shift+P` → Fabric: Connect to Workspace)
- Sign in with your Microsoft account
- Your Fabric workspaces will appear in the explorer
Configuration
| Setting | Description | Default |
| --- | --- | --- |
| `fabricLakehouse.tenantId` | Azure tenant ID | (empty) |
| `fabricLakehouse.defaultWorkspaceId` | Auto-connect workspace ID | (empty) |
| `fabricLakehouse.previewRowLimit` | Max rows in table preview | `100` |
| `fabricLakehouse.apiEndpoint` | Fabric REST API base URL | `https://api.fabric.microsoft.com/v1` |
| `fabricLakehouse.oneLakeEndpoint` | OneLake DFS endpoint | `https://onelake.dfs.fabric.microsoft.com` |
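For reference, these settings can be placed in VS Code's settings.json; the IDs below are placeholders:

```json
{
  "fabricLakehouse.tenantId": "00000000-0000-0000-0000-000000000000",
  "fabricLakehouse.defaultWorkspaceId": "11111111-1111-1111-1111-111111111111",
  "fabricLakehouse.previewRowLimit": 500,
  "fabricLakehouse.apiEndpoint": "https://api.fabric.microsoft.com/v1",
  "fabricLakehouse.oneLakeEndpoint": "https://onelake.dfs.fabric.microsoft.com"
}
```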
Detailed Walkthrough
1. Connect to Microsoft Fabric
- Install the extension and reload VS Code
- Look for the Microsoft Fabric icon in the Activity Bar (left sidebar)
- Click the icon to open the Workspaces panel
- You'll see a placeholder: "Click to connect to Fabric..." — click it
- A Microsoft sign-in prompt will appear — authenticate with your Azure AD account
- Once signed in, all Fabric workspaces your account can access will populate the tree
Tip: To auto-connect on startup, set `fabricLakehouse.defaultWorkspaceId` to your workspace ID in VS Code settings.
2. Browse Workspace Items
- Expand any workspace node in the tree
- You'll see five categories:
- 📦 Lakehouses — Delta Lake stores
- 🔄 Data Pipelines — ETL orchestration
- 📈 Semantic Models — Power BI datasets
- 🔍 KQL Databases — Real-Time Intelligence
- 📓 Notebooks — Spark notebooks
- Expand a category to list all items of that type
- Expand a Lakehouse to see its tables and a 📁 Files folder
3. Create a New Lakehouse
- In the Workspaces tree, right-click on a workspace node
- Select Fabric: Create Lakehouse
- Enter a name (e.g., `sales-lakehouse`) and press Enter
- Optionally add a description
- A progress notification will appear while the lakehouse is provisioned
- Once complete, click Refresh to see it in the tree
4. Preview Table Data
- Expand a Lakehouse node to see its tables
- Click the table icon (inline button) on any table, or right-click → Fabric: Preview Table Data
- A webview panel opens showing:
- Column names with data types
- A scrollable grid of rows (default: 100 rows)
- Row count summary (e.g., "Showing 100 of 1,245,302 rows")
- Adjust the preview row limit via the `fabricLakehouse.previewRowLimit` setting
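As a rough illustration of the row-limit behavior, here is a minimal sketch (not the extension's actual implementation) of how rows are capped and summarized the way the preview panel reports them:

```python
# Cap a result set at the configured preview limit and produce a
# "Showing N of M rows" summary like the preview panel's.
# Illustrative sketch only; the extension's internals may differ.

def preview_summary(rows, limit=100):
    """Return the capped rows plus a summary string."""
    shown = rows[:limit]
    return shown, f"Showing {len(shown):,} of {len(rows):,} rows"

rows = [{"id": i} for i in range(250)]
shown, summary = preview_summary(rows, limit=100)
# summary → "Showing 100 of 250 rows"
```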
5. Browse OneLake Files
- Expand a Lakehouse node in the tree
- Click the 📁 Files folder (or right-click the lakehouse → Fabric: Browse OneLake Files)
- A webview panel opens showing:
- File and folder names with 📁/📄 icons
- File sizes (formatted as KB/MB/GB)
- Last modified timestamps
- This uses the OneLake DFS API (ADLS Gen2 compatible) to list contents under the lakehouse `Files/` path
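A minimal sketch of the two pieces this view relies on: human-readable size formatting, and an ADLS Gen2-style directory-listing URL against the OneLake DFS endpoint. The exact URL shape and names here are illustrative assumptions:

```python
# Format byte counts the way the file browser shows them (KB/MB/GB),
# and build a OneLake DFS "list directory" URL following the ADLS Gen2
# Path List API shape. Sketch only; parameter names are assumptions.

def format_size(num_bytes):
    size = float(num_bytes)
    for unit in ("B", "KB", "MB", "GB"):
        if size < 1024 or unit == "GB":
            return f"{size:.1f} {unit}"
        size /= 1024.0

def list_files_url(workspace, lakehouse, subdir=""):
    base = "https://onelake.dfs.fabric.microsoft.com"
    directory = f"{lakehouse}.Lakehouse/Files" + (f"/{subdir}" if subdir else "")
    return (f"{base}/{workspace}?resource=filesystem"
            f"&directory={directory}&recursive=false")

print(format_size(1536))                      # 1.5 KB
print(list_files_url("MyWorkspace", "sales-lakehouse"))
```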
6. Create a OneLake Shortcut
Shortcuts let you reference data in other locations without copying it.
- Right-click a Lakehouse node → Fabric: Create OneLake Shortcut
- Pick the target type:
- OneLake — another Fabric lakehouse
- ADLS Gen2 — Azure Data Lake Storage
- Amazon S3 — AWS S3 bucket
- Google Cloud Storage — GCS bucket
- Enter the shortcut name (e.g., `external-sales`)
- Provide the target path (e.g., `/Tables/sales`)
- Enter target-specific details:
- OneLake: Target workspace ID + item ID
- ADLS Gen2: Storage account, container, connection ID
- S3/GCS: Bucket name, connection ID
- The shortcut appears in the lakehouse immediately
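For a OneLake-to-OneLake target, the request body likely resembles the sketch below. Field names follow the public Fabric Shortcuts API shape, but treat them as assumptions rather than a spec:

```python
# Build the JSON body a "create shortcut" call might send for a
# OneLake-to-OneLake shortcut. Illustrative sketch; placeholder GUIDs.

def onelake_shortcut_body(name, parent_path, target_workspace_id,
                          target_item_id, target_path):
    return {
        "name": name,          # e.g. "external-sales"
        "path": parent_path,   # where the shortcut is created, e.g. "Tables"
        "target": {
            "oneLake": {
                "workspaceId": target_workspace_id,
                "itemId": target_item_id,
                "path": target_path,  # e.g. "/Tables/sales"
            }
        },
    }

body = onelake_shortcut_body("external-sales", "Tables",
                             "<workspace-guid>", "<item-guid>", "/Tables/sales")
```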
7. View a Data Pipeline
- Expand Data Pipelines under a workspace
- Click the circuit-board icon on a pipeline, or right-click → Fabric: View Data Pipeline
- A webview opens showing:
- Activity cards arranged left-to-right with type badges (Copy = green, Notebook = purple, DataFlow = blue)
- Dependency arrows between activities
- Parameters table (if the pipeline has parameters)
- A ▶ Run Pipeline button at the top
8. Create a Data Pipeline
- Right-click a workspace node → Fabric: Create Data Pipeline
- Enter a pipeline name (e.g., `ingest-daily-sales`)
- Pick a template:
- Empty Pipeline — blank canvas
- Copy Activity — pre-configured copy from delimited text to Parquet
- Notebook Activity — runs a Fabric notebook
- The pipeline is created and visible after refreshing the tree
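Under the hood, Fabric item definitions carry their content as base64-encoded parts. A sketch of how a create-pipeline request might be assembled; the activity JSON is a stripped-down illustration, not the full pipeline schema:

```python
import base64
import json

# Assemble a "create DataPipeline" item body with an inline base64
# definition part. Sketch only; part path and schema are assumptions.

def create_pipeline_body(name, activities):
    definition = {"properties": {"activities": activities}}
    payload = base64.b64encode(json.dumps(definition).encode()).decode()
    return {
        "displayName": name,
        "type": "DataPipeline",
        "definition": {
            "parts": [{
                "path": "pipeline-content.json",
                "payload": payload,
                "payloadType": "InlineBase64",
            }]
        },
    }

body = create_pipeline_body("ingest-daily-sales",
                            [{"name": "Copy1", "type": "Copy"}])
```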
9. Run a Data Pipeline
- Right-click a pipeline node → Fabric: Run Data Pipeline
- Confirm the run in the modal dialog
- The pipeline job is submitted and you'll get a notification with the job info
- You can also run from inside the pipeline viewer webview using the ▶ Run Pipeline button
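A pipeline run most likely goes through the Fabric on-demand job endpoint; a sketch of the URL it would target, with placeholder IDs:

```python
# Build the "run on-demand item job" URL for a pipeline, following the
# Fabric Jobs API shape. Sketch only; IDs are placeholders.

def run_pipeline_url(workspace_id, pipeline_id,
                     base="https://api.fabric.microsoft.com/v1"):
    return (f"{base}/workspaces/{workspace_id}/items/{pipeline_id}"
            f"/jobs/instances?jobType=Pipeline")

print(run_pipeline_url("<workspace-guid>", "<pipeline-guid>"))
```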
10. Execute a KQL Query
Option A — From the tree:
- Right-click a KQL Database node → Fabric: Run KQL Query
- The query service URI and database name are auto-populated
- Enter your KQL query (e.g., `.show tables` or `MyTable | take 10`)
Option B — From a .kql file:
- Open or create a file with a `.kql` extension
- Write your KQL query
- Run `Ctrl+Shift+P` → Fabric: Run KQL Query
- The extension reads the entire file (or just your selection) as the query
Option C — From any editor:
- Select text in any editor
- Run Fabric: Run KQL Query
- The selected text is used as the query
Results panel shows:
- Column headers with data types
- Tabular results grid
- Execution statistics (time, CPU, memory)
- A textarea + Run button to modify and re-run queries inline
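Behind any of these options, the query is presumably posted to the database's query URI. A sketch using the Kusto REST v1 body shape (`{"db": ..., "csl": ...}`); the URI and names are placeholders:

```python
# Build the request a KQL run might issue against the query service URI.
# Sketch only; follows the Kusto REST v1 query body shape.

def kql_request(query_uri, database, query):
    return {
        "url": f"{query_uri}/v1/rest/query",
        "body": {"db": database, "csl": query},
    }

req = kql_request("https://trd-example.kusto.fabric.microsoft.com",
                  "MyEventhouseDb", "MyTable | take 10")
```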
11. View a Semantic Model
- Expand Semantic Models under a workspace
- Click the graph icon on a model, or right-click → Fabric: View Semantic Model
- A webview shows:
- Tables with all columns, data types, and hidden-column indicators (🔒)
- Measures with DAX expressions
- Relationships showing from/to columns with cardinality (OneToMany, ManyToMany, etc.)
12. Deploy Across Workspaces
This uses Fabric deployment pipelines (Dev → Test → Prod) for CI/CD.
- Right-click a workspace node → Fabric: Deploy to Workspace
- Pick a deployment pipeline from the list (created in the Fabric portal)
- Select the source stage (e.g., Development)
- Select the target stage (e.g., Test or Production)
- Choose which items to deploy — lakehouses, pipelines, notebooks, semantic models, etc.
- Set whether to allow overwriting existing items in the target
- Confirm deployment in the modal dialog
- A progress notification tracks the deployment until complete
- On success, you'll see: "✅ Deployed 5 items to Production successfully."
Prerequisite: Deployment pipelines must be created in the Fabric portal first. This extension uses existing pipelines — it doesn't create them.
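A selective deploy maps naturally onto a request body like the sketch below. The shape follows the Fabric deployment pipelines "deploy" API; the field names should be treated as assumptions:

```python
# Build a selective-deploy request body: which items to promote from the
# source stage to the target stage, and whether overwriting is allowed.
# Sketch only; stage and item IDs are placeholders.

def deploy_body(source_stage_id, target_stage_id, items, allow_overwrite=True):
    return {
        "sourceStageId": source_stage_id,
        "targetStageId": target_stage_id,
        "items": [{"sourceItemId": item_id, "itemType": item_type}
                  for item_id, item_type in items],
        "options": {
            "allowOverwriteArtifact": allow_overwrite,
            "allowCreateArtifact": True,
        },
    }

body = deploy_body("<dev-stage-id>", "<test-stage-id>",
                   [("<lakehouse-id>", "Lakehouse"),
                    ("<notebook-id>", "Notebook")])
```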
Commands
| Command | Description |
| --- | --- |
| Fabric: Connect to Workspace | Authenticate and load workspaces |
| Fabric: Refresh Explorer | Refresh the workspace tree |
| Fabric: Create Lakehouse | Create a new lakehouse in a workspace |
| Fabric: Preview Table Data | Preview rows from a lakehouse table |
| Fabric: Browse OneLake Files | Browse files in a lakehouse |
| Fabric: Create OneLake Shortcut | Create a shortcut to external data |
| Fabric: View Data Pipeline | Visualize pipeline activities |
| Fabric: Run Data Pipeline | Execute a data pipeline |
| Fabric: Create Data Pipeline | Create a new pipeline from a template |
| Fabric: Run KQL Query | Execute a KQL query |
| Fabric: View Semantic Model | Inspect a semantic model definition |
| Fabric: Deploy to Workspace | Deploy items across deployment stages |
API Coverage
This extension uses the following Microsoft Fabric APIs:
- Fabric REST API (`api.fabric.microsoft.com/v1`) — Workspaces, Items, Lakehouses, Data Pipelines, Semantic Models, KQL Databases, Deployment Pipelines, Jobs
- OneLake DFS API (`onelake.dfs.fabric.microsoft.com`) — ADLS Gen2-compatible file operations for browsing, uploading, and managing lakehouse files
License
MIT License — see LICENSE.txt