Databricks Deploy Files

by moqku.co
A custom task for Azure DevOps to deploy Databricks files between environments

About the Databricks Deploy Files extension

This extension facilitates the deployment of Databricks files between environments as part of a CI/CD process. It supports deploying any type of file from an Azure DevOps repository branch or a source Databricks Workspace to a target Databricks Workspace.


File Format Preservation

All files retain their original format, which enables modular development (e.g., importing Python scripts) and the seamless embedding of images in notebooks for documentation and explanations.

Example of a notebook with an image and Python import:

(Screenshot: notebook example)


Transfer folders

The task transfers the contents of one or more folders, provided as parameters, from one environment to another.


Transfer from branch to workspace, or workspace to workspace

Example of a possible workflow in Azure DevOps:

(Screenshot: workflow example)

Azure DevOps release:

(Screenshot: release example)
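
The screenshots above show a classic release pipeline, but the task can also be used from a YAML pipeline. The snippet below is a minimal sketch of a branch-to-workspace deployment authenticated via pipeline variables; the task identifier (DatabricksDeployFiles@1) and all input names are assumptions made for illustration, since the extension's actual YAML schema is not documented on this page.

```yaml
# Minimal, hypothetical YAML usage. The task identifier and every input name
# below are assumptions for illustration; check the installed task's schema
# for the real names.
steps:
  - task: DatabricksDeployFiles@1
    inputs:
      foldersToDeploy: '["folder1", "folder2", "folder3/utilities"]'
      sourceIsRepository: true                   # deploy from the branch linked to this run
      # Names of the secret pipeline variables holding the target workspace credentials
      targetWorkspaceUrlVariable: 'DatabricksTargetUrl'
      targetWorkspacePatVariable: 'DatabricksTargetPat'
```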


Parameters definition

Folders to Deploy:

Provide a single folder or a list of folders, given as paths relative to the workspace root (for example, if the folder to transfer is /Workspace/my_folder, the value to provide is my_folder); see the sketch after the examples below.

  • Example of a list of folders (note: double quotes required): ["folder1", "folder2", "folder3/utilities"]

  • Example of a single folder (note: no double quotes): folder3/utilities
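
In a YAML pipeline, the two value forms would look as follows (the input name foldersToDeploy is a hypothetical placeholder, as above):

```yaml
# List form (double quotes around each folder are required):
foldersToDeploy: '["folder1", "folder2", "folder3/utilities"]'
# Single-folder form (no quotes or brackets in the value):
# foldersToDeploy: 'folder3/utilities'
```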

Source is Repository:

  • Boolean value to determine if the source is a repository
    • Checked: if code should be transferred from the branch linked to the release to a Databricks workspace
    • Unchecked: if code should be transferred from one Databricks workspace to another

Authentication Variables Location:

1. Pipeline Variables: If authentication variables are stored in pipeline variables

  • Databricks Source Workspace Parameters - Only required if Source is Repository is unchecked.

    • Source Databricks workspace URL: name of the secret pipeline variable holding the source Databricks workspace URL
    • Source Databricks Workspace PAT: name of the secret pipeline variable holding the source Databricks workspace PAT
  • Databricks Target Workspace Parameters

    • Target Databricks workspace URL: name of the secret pipeline variable holding the target Databricks workspace URL
    • Target Databricks Workspace PAT: name of the secret pipeline variable holding the target Databricks workspace PAT

2. Azure Key Vault: If authentication variables are stored in Azure Key Vault

  • Azure Authentication Parameters - Only required if authentication variables are stored in Azure Key Vault.

    • Service Principal ID: Provide the Azure Application (client) ID of the service principal
    • Service Principal Key: Provide the service principal's client secret
    • Azure Directory (Tenant ID): Provide the Azure tenant ID
  • Databricks Source Workspace Parameters - Only required if Source is Repository is unchecked.

    • Azure Key Vault Name: Name of the Key Vault where secrets for the source Databricks workspace are stored
    • Source Databricks Workspace URL Secret Name: Name of the secret for the source Databricks workspace URL
    • Source Databricks Workspace PAT Secret Name: Name of the secret for the source Databricks workspace PAT (Personal Access Token)
  • Databricks Target Workspace Parameters

    • Azure Key Vault Name: Name of the Key Vault where secrets for the target Databricks workspace are stored
    • Target Databricks Workspace URL Secret Name: Name of the secret for the target Databricks workspace URL
    • Target Databricks Workspace PAT Secret Name: Name of the secret for the target Databricks workspace PAT

Example of parameterization

(Screenshot: parameterization example)
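
To complement the screenshot, here is a hedged YAML sketch of a workspace-to-workspace deployment authenticating through Azure Key Vault. As before, the task identifier and every input name are assumptions made for illustration, not the extension's documented schema.

```yaml
# Hypothetical workspace-to-workspace deployment with secrets read from
# Azure Key Vault. All identifiers below are illustrative assumptions.
steps:
  - task: DatabricksDeployFiles@1
    inputs:
      foldersToDeploy: 'my_folder'          # /Workspace/my_folder in the source workspace
      sourceIsRepository: false             # copy workspace-to-workspace, not from the branch
      # Azure authentication parameters (service principal used to read Key Vault)
      servicePrincipalId: '$(AzureClientId)'
      servicePrincipalKey: '$(AzureClientSecret)'
      tenantId: '$(AzureTenantId)'
      # Source workspace secrets
      sourceKeyVaultName: 'kv-databricks-dev'
      sourceWorkspaceUrlSecretName: 'dbx-dev-url'
      sourceWorkspacePatSecretName: 'dbx-dev-pat'
      # Target workspace secrets
      targetKeyVaultName: 'kv-databricks-prod'
      targetWorkspaceUrlSecretName: 'dbx-prod-url'
      targetWorkspacePatSecretName: 'dbx-prod-pat'
```

Keeping only secret names in the pipeline and resolving the actual values from Key Vault at run time keeps the workspace PATs out of the pipeline configuration entirely.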
