Azure Databricks File Converter

Publisher: DatabricksVsCodeConnect
A VS Code extension that seamlessly converts Python, SQL, and Scala files to Jupyter notebooks via Azure Databricks, and converts Jupyter notebooks back to source files.

Features

  • Convert to Notebook: Right-click on any .py, .sql, or .scala file and convert it to a Jupyter notebook
  • Convert to Source: Right-click on any .ipynb file and convert it back to the original source file format
  • Comprehensive Logging: All operations are logged in the output channel for debugging and tracking
  • Azure Databricks Integration: Leverages Databricks workspace and DBFS APIs for conversion

Requirements

  • Azure Databricks workspace
  • Personal Access Token for authentication
  • Active Databricks cluster (cluster ID)

Installation

  1. Install the extension from the VS Code Marketplace (or load it from a VSIX file)
  2. Configure your Azure Databricks connection

Configuration

Before using the extension, configure your Azure Databricks connection:

  1. Open the Command Palette (Ctrl+Shift+P, or Cmd+Shift+P on macOS)
  2. Run the command: ADB: Configure Connection
  3. Provide the following information:
    • Databricks Workspace URL: e.g., https://adb-xxxxx.azuredatabricks.net
    • Personal Access Token: Your Databricks PAT (starts with dapi)
    • Cluster ID: Your Databricks cluster ID (format: 0123-456789-abc123)

Alternatively, you can set these in VS Code settings:

  • adbConverter.databricksUrl
  • adbConverter.token
  • adbConverter.clusterId
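
In settings.json, those three settings might look like this (all values below are placeholders, not real credentials):

```json
{
  "adbConverter.databricksUrl": "https://adb-xxxxx.azuredatabricks.net",
  "adbConverter.token": "dapi<your-personal-access-token>",
  "adbConverter.clusterId": "0123-456789-abc123"
}
```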

Usage

Convert Source File to Jupyter Notebook

  1. Right-click on a .py, .sql, or .scala file in the Explorer
  2. Select ADB: Convert to Jupyter Notebook
  3. The extension will:
    • Upload the file to the Databricks /Shared folder under a temporary name
    • Import it as a notebook
    • Export it as a Jupyter notebook (.ipynb)
    • Clean up temporary files
  4. The notebook will be created in the same directory as the source file

Convert Jupyter Notebook to Source File

  1. Right-click on an .ipynb file in the Explorer
  2. Select ADB: Convert to Source File
  3. The extension will:
    • Upload the notebook to Databricks
    • Export it as a source file
    • Save the source file in the same directory
    • Delete the original .ipynb file
    • Clean up temporary files

Logging

All operations are logged to the "Azure Databricks Converter" output channel. To view logs:

  1. Open the Output panel (View > Output)
  2. Select "Azure Databricks Converter" from the dropdown

Logs include:

  • Operation start/completion
  • File upload/download progress
  • API calls to Databricks
  • Error messages and stack traces
  • Cleanup operations

How It Works

Source to Notebook Conversion

```
Source File (.py/.sql/.scala)
    ↓
Upload to DBFS (/Shared/filename_temp_uuid)
    ↓
Import to Workspace as Notebook
    ↓
Export as Jupyter Notebook
    ↓
Save locally as .ipynb
    ↓
Clean up DBFS and Workspace
```
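
The import and export steps above correspond to the public Databricks REST API 2.0 endpoints (`/api/2.0/dbfs/put`, `/api/2.0/workspace/import`, `/api/2.0/workspace/export`). A minimal sketch of how the requests could be built, assuming hypothetical helper names (`buildImportRequest`, `buildExportQuery`) that are illustrative and not the extension's actual internals:

```typescript
// Sketch of request construction for the documented conversion flow.
// Endpoints are the public Databricks REST API 2.0; the helper names and
// shapes here are illustrative, not the extension's actual code.

type DatabricksLanguage = "PYTHON" | "SQL" | "SCALA";

const LANGUAGE_BY_EXT: Record<string, DatabricksLanguage> = {
  ".py": "PYTHON",
  ".sql": "SQL",
  ".scala": "SCALA",
};

// Body for POST /api/2.0/workspace/import: Databricks expects the file
// content base64-encoded, plus an explicit format and language.
function buildImportRequest(workspacePath: string, source: string, ext: string) {
  const language = LANGUAGE_BY_EXT[ext];
  if (!language) {
    throw new Error(`Unsupported extension: ${ext}`);
  }
  return {
    path: workspacePath,
    format: "SOURCE",
    language,
    content: Buffer.from(source, "utf8").toString("base64"),
    overwrite: true,
  };
}

// Query string for GET /api/2.0/workspace/export; format=JUPYTER yields
// an .ipynb payload that can be saved next to the original source file.
function buildExportQuery(workspacePath: string): string {
  return `path=${encodeURIComponent(workspacePath)}&format=JUPYTER`;
}
```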

Notebook to Source Conversion

```
Jupyter Notebook (.ipynb)
    ↓
Import to Databricks Workspace
    ↓
Export as Source File
    ↓
Save locally with appropriate extension
    ↓
Delete original .ipynb file
    ↓
Clean up Workspace
```
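
Saving "with appropriate extension" requires knowing the notebook's language. One way to sketch this step, assuming the common Jupyter metadata fields and a hypothetical helper name (`targetExtension`) rather than the extension's actual logic:

```typescript
// Hypothetical helper for the notebook -> source direction: pick the output
// file extension from the notebook's Jupyter metadata. The metadata fields
// consulted here follow common Jupyter conventions, not a guarantee about
// what the extension itself reads.

const EXT_BY_LANGUAGE: Record<string, string> = {
  python: ".py",
  sql: ".sql",
  scala: ".scala",
};

function targetExtension(notebookJson: string): string {
  const nb = JSON.parse(notebookJson);
  // Prefer language_info.name, fall back to kernelspec.language.
  const lang: string | undefined =
    nb?.metadata?.language_info?.name ?? nb?.metadata?.kernelspec?.language;
  const ext = EXT_BY_LANGUAGE[(lang ?? "").toLowerCase()];
  if (!ext) {
    throw new Error("Cannot determine source language from notebook metadata");
  }
  return ext;
}
```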

Supported File Types

  • Python: .py
  • SQL: .sql
  • Scala: .scala
  • Jupyter Notebooks: .ipynb

Troubleshooting

"Please configure Databricks connection first"

  • Run ADB: Configure Connection command and provide your credentials

"Failed to upload to DBFS"

  • Check your Databricks URL and token
  • Ensure the token has appropriate permissions
  • Verify network connectivity to your Databricks workspace

"Failed to import to workspace"

  • Verify your cluster ID is correct
  • Check that the file format is supported
  • Review logs in the output channel for detailed error messages

File not converting properly

  • Check the output channel for detailed logs
  • Ensure the source file has valid syntax
  • Verify the file extension is supported

Security

  • Your Personal Access Token is stored in VS Code settings (plain-text JSON), so avoid committing workspace settings that contain it to source control
  • Tokens are transmitted over HTTPS to Azure Databricks
  • Temporary files are automatically cleaned up after conversion
  • Consider using workspace-level settings for shared configurations

Development

To build and run the extension locally:

```shell
# Install dependencies
npm install

# Compile TypeScript
npm run compile

# Watch for changes
npm run watch
```

Press F5 in VS Code to launch the Extension Development Host.

Commands

  • ADB: Convert to Jupyter Notebook - Convert source file to notebook
  • ADB: Convert to Source File - Convert notebook to source file
  • ADB: Configure Connection - Configure Databricks connection
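
In a VS Code extension, commands like these are declared under `contributes.commands` in package.json. A plausible shape is sketched below; the command IDs are guesses, since the README only shows the display titles:

```json
{
  "contributes": {
    "commands": [
      { "command": "adbConverter.convertToNotebook", "title": "ADB: Convert to Jupyter Notebook" },
      { "command": "adbConverter.convertToSource", "title": "ADB: Convert to Source File" },
      { "command": "adbConverter.configureConnection", "title": "ADB: Configure Connection" }
    ]
  }
}
```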

Contributing

Contributions are welcome! Please feel free to submit issues or pull requests.

License

MIT

Release Notes

0.0.1

Initial release with basic functionality:

  • Convert Python/SQL/Scala files to Jupyter notebooks
  • Convert Jupyter notebooks back to source files
  • Configuration management
  • Comprehensive logging