Local Backup and Restore for Azure DevOps
Do you want to have full control of your Azure DevOps data? Do you want to be able to restore deleted data even after the standard 30 or 90 days have expired? Or perhaps you want to move your data from the cloud to on-premises?
We've got you covered!
Solidify’s Azure DevOps backup tool lets you back up your Azure DevOps data and save it within your organization, so that you have full ownership of it.
Protecting your valuable Azure DevOps data has never been easier. Our tool provides a robust and reliable solution for backing up and restoring your Azure DevOps resources, ensuring the safety and integrity of your critical information.
Why Choose Azure DevOps Backup Tool?
- Comprehensive Backup: Effortlessly create backups of your Azure DevOps resources, whether it's a single project or your entire organization. Preserve important data such as work items, Git repositories, pipelines, teams and board configuration, project wikis, test plans, artifacts, and much more.
- Flexible Storage Options: Store your backups wherever you want in the network or in the cloud. Choose the storage option that best suits your organization's needs and ensure easy accessibility when you need to restore your data.
- Migration Made Simple: Seamlessly migrate your Azure DevOps resources between instances and projects. Whether you're reorganizing your workflows, moving to a new environment, or consolidating projects, our tool simplifies the migration process, saving you time and effort.
- Pipeline Integration: The Azure DevOps Backup Tool seamlessly integrates with Azure Pipelines, allowing you to incorporate backup and restore tasks directly into your pipeline workflows. Automate your backup operations and ensure consistent data protection with every build and release.
- Easy-to-Use Interface: Our user-friendly interface makes it simple to configure and schedule your backups. With just a few clicks, you can initiate backups, monitor their progress, and review detailed logs to ensure the integrity of your data.
Secure Your Azure DevOps Data Today
Don't leave your Azure DevOps data vulnerable to loss or corruption. With our tool, you can focus on your projects, knowing that your data is safe and can be easily restored whenever needed.
Everything through Azure Pipelines with a smooth and easy-to-use interface!
Still not sure? Try our free 7-day trial. Click the "Get It Now" button and get started!
Features
- Backup and restore ADO resources locally (Single project or whole organization)
- Migrate ADO resources between ADO instances and projects
- Back up all organizational data in a single ADO pipeline.
The following resources are supported as of today:
- Area/Iteration paths
- Teams and team settings
- Git repositories
  - Pull requests
  - Branch policies
- Project wikis
- Pipelines
  - Build definitions
  - Release definitions
  - Task groups
  - Variable groups
  - Environments
  - Deployment groups
- Work Items
  - Revision history
  - Comments
  - Attachments
  - Links
    - Work Items links
    - Code links (repo/commit/branch/tag)
- Work Item Queries
- Dashboards
- Work Item Process models
- Test plans
  - Test suites
  - Test cases
  - NO test executions
- Artifacts
  - Feeds and packages
    - NuGet
    - Npm
    - Universal
    - Python (export only)
Task overview
Azure DevOps Backup Tool provides a set of pipeline tasks that back up Azure DevOps data as JSON files to a local file directory.
The extension consists of the following tasks:
- Azure DevOps Backup Tool: Export - Run export jobs via a pipeline task. Save the downloaded files to a local file area or network file share.
- Azure DevOps Backup Tool: Import - Run import jobs via a pipeline task. Read the files from a local file area or network file share.
Example
Example task setup for an export job (work items only):

```yaml
# Run export job
- task: ado-backup-tool-export@1
  displayName: 'ADO Backup Tool: Export'
  inputs:
    source: 'https://dev.azure.com/solidifydemo'
    sourceOrgName: 'solidifydemo'
    sourceProject: 'ContosoAir'
    sourceUsername: 'john.doe@solidify.dev'
    sourcePAT: '$(migrationToken)'
    onPrem: false
    workspace: 'C:\AdoBackupWorkspace'
    resourceWorkItem: true
  env:
    SYSTEM_ACCESSTOKEN: $(system.accesstoken)

# Save a snapshot on the backup server (here denoted with X:\)
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: 'C:\AdoBackupWorkspace'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: 'X:\AdoBackupFolder\Backup-$(Build.BuildNumber).zip'
    replaceExistingArchive: false
```
Example task setup for an import job (work items only):

```yaml
# Extract the snapshot from the backup server (here denoted with X:\)
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: 'X:\AdoBackupFolder\Backup-20230614.4.zip'
    destinationFolder: 'C:\AdoBackupWorkspace'
    cleanDestinationFolder: true

# Run import job
- task: ado-backup-tool-import@1
  inputs:
    source: 'https://dev.azure.com/solidifydemo'
    target: 'https://dev.azure.com/solidifydemo'
    sourceOrgName: 'solidifydemo'
    targetOrgName: 'solidifydemo'
    sourceProject: 'ContosoAir'
    targetProject: 'ContosoAirMigrated'
    targetUsername: 'john.doe@solidify.dev'
    sourcePAT: '$(migrationToken)'
    targetPAT: '$(migrationToken)'
    onPrem: false
    workspace: 'C:\AdoBackupWorkspace'
    mappingFilePath: '$(System.DefaultWorkingDirectory)/mappings'
    resourceWorkItem: true
  env:
    SYSTEM_ACCESSTOKEN: $(system.accesstoken)
```
Procedure for restoring your Azure DevOps organization or project
The following document contains a detailed procedure for restoring your Azure DevOps organization or project using the Azure DevOps Backup Tool Import task: https://github.com/solidify/azure-devops-backup-tool-docs/blob/main/restore.md.
As of today (2023-02-17), the Azure DevOps Backup Tool tasks are only supported on Windows 64-bit systems. We plan to extend support to Linux-based systems in the near future.
You will need to run the Export/Import task on either a self-hosted Windows pool or `pool: windows-latest`.
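For reference, the pipeline header for a backup job on a Microsoft-hosted Windows agent might look like the following sketch. Only the `pool` block is required; the nightly cron schedule is optional and illustrative.

```yaml
trigger: none

# Optional: run the backup automatically every night at 02:00 UTC (illustrative schedule)
schedules:
- cron: '0 2 * * *'
  displayName: Nightly backup
  branches:
    include:
    - main
  always: true

pool:
  vmImage: windows-latest
```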
Software requirements on the build agent
Microsoft-hosted agent vs. self-hosted agent
You may opt to use a Microsoft-hosted build machine (`pool: windows-latest`) or a self-hosted build machine (a self-hosted Windows pool). The choice is essentially a tradeoff between data control and privacy on the one hand, and ease of use and hosting/configuration on the other.
Benefits and drawbacks of using a Microsoft hosted agent:
- (+) The agent is preconfigured with all dependencies, and no additional configuration is needed.
- (+) No self hosting needed.
- (-) The maximum runtime for a Microsoft-hosted Windows agent is 60 minutes.
Benefits and drawbacks of using a self-hosted agent:
- (+) Full control over the build environment.
- (+) Enables usage of the tool in a locked down network scenario.
- (+) No disclosure of data or metrics to any outside vendor or agent.
- (-) Requires some configuration.
- (-) May require ongoing upgrades and maintenance of the agent and the agent server.
Due to the runtime limit on Microsoft-hosted agents, using a self-hosted agent is necessary for larger organizations and backup jobs.
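If the 60-minute limit is a concern, the same pipeline can target a self-hosted Windows pool instead. A minimal sketch, assuming a hypothetical pool named SelfHostedWindowsPool:

```yaml
pool:
  name: SelfHostedWindowsPool     # illustrative name of your self-hosted agent pool
  demands:
  - Agent.OS -equals Windows_NT   # ensure the job lands on a Windows agent
```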
Dedicated ADO Project for backup jobs
The general recommendation is to use a separate Azure DevOps Project to store your backup pipelines and custom configurations. If you already have a Project for other maintenance jobs, you may use that instead.
The motivation is that by using a separate ADO Project for your maintenance and backup jobs, you can restrict access to these jobs, their logs, and so on, by giving only certain people access to the maintenance Project.
We offer a separate distribution of Azure DevOps Backup Tool as a Windows CLI application, for scenarios where your organization's compliance policies may not allow you to run an Azure DevOps extension.
The CLI does not send any data to outside servers, and no analytics data are sent to Solidify or any other organization.
The Azure DevOps Backup Tool CLI is free for customers who have already purchased a license.
To obtain the CLI + documentation, contact us at support.adobackup@solidify.dev.
Installation
Install Azure DevOps Backup Tool to your Azure DevOps Organization.
Activate your 7-day free trial
Contact us at support.adobackup@solidify.dev to claim your free trial. You will receive a license file which you will need in order to proceed with the next step.
Before you get started, you must import your .json license file into the extension's storage. If you have been in contact with us before, chances are you have already been given a license file. Otherwise, contact us at support.adobackup@solidify.dev to inquire about purchasing a license.
The validity period of the license is 12 months, after which you must request a new license from us.
You will need to open the configuration page via Organization Settings -> Extensions -> Azure DevOps Backup Tool -> Enter License Manually. Paste the contents of the json file in its entirety.
Task parameters, explanation
| Parameter | Type | Description |
|---|---|---|
| Source URL | string | URL to the ADO organization/collection where the resources live. |
| Target URL | string | URL to the ADO organization/collection where the resources will be restored. |
| Organization/Collection | string | Name of the ADO Organization or Collection (same as in the URL). |
| Project Name | string | Name of the project (supports glob patterns, for example `*` for all projects in an organization). |
| User Name | string | E-mail/UniqueName of the user running the backup/migration job. |
| PAT | string | Personal Access Token of the user (should be stored as a secret variable). |
| On Prem? | boolean | `true` for ADO Server, `false` for ADO Services. |
| Migration Workspace | file path | Path to the migration workspace. The JSON resources will be saved to/loaded from this file path. |
| Path to mapping files (Import task only) | file path | Path to the mapping files. See https://github.com/solidify/azure-devops-backup-tool-docs/blob/main/restore.md |
| Use Custom Configuration Files? | boolean | Whether to use custom configuration files (advanced, see the next section). |
| Path to custom configuration files | file path | Path to the custom configuration files. |
| Migrate XXX | boolean | true/false for each resource type. Should the given resource be included in the backup/migration job or not? |
Advanced usage: use custom config files
You can use custom json configuration files to gain finer control over your backup/migration jobs.
Common scenarios for using custom config files include:
- Filtering git repositories by name (wildcard supported)
- Filtering build/release pipelines by name (wildcard supported)
- Filtering work items based on a WIQL query
- Overriding default git credentials for git repositories and wikis
You will need to supply the `useCustomConfigurations` and `customConfigurationPath` parameters in the task configuration, like this:

```yaml
useCustomConfigurations: true
customConfigurationPath: '$(System.DefaultWorkingDirectory)/custom-configs'
```
This will configure the task to read your custom JSON configuration files from the `$(System.DefaultWorkingDirectory)/custom-configs` directory.
Our recommendation is to store the config files in a git repository in the same project as the backup pipeline. You can either store the config files:
- in the same repository as the backup pipeline definitions (specify the correct folder)
- OR in a separate repository, and check out your configuration repository in the pipeline with the `resources` keyword (see the sketch after this list).
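Below is a minimal sketch of the second option, assuming a hypothetical backup-configs repository in a MaintenanceProject project (repository, project, and alias names are illustrative):

```yaml
resources:
  repositories:
  - repository: backupConfigs                 # alias used by the checkout step below
    type: git
    name: MaintenanceProject/backup-configs   # illustrative project/repository
    ref: refs/heads/main

steps:
- checkout: backupConfigs
  path: custom-configs                        # checked out under $(Pipeline.Workspace)/custom-configs

- task: ado-backup-tool-export@1
  inputs:
    # ...other inputs as in the export example above...
    useCustomConfigurations: true
    customConfigurationPath: '$(Pipeline.Workspace)/custom-configs'
```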
Templates for custom configuration files for each adapter can be found here: https://github.com/solidify/azure-devops-backup-tool-docs.
Your config files must follow the naming pattern `config-[resource]-[export/import].json`. The following list shows all the valid config file names:
config-areapath-export.json
config-areapath-import.json
config-artifact-export.json
config-artifact-import.json
config-board-export.json
config-board-import.json
config-dashboard-export.json
config-dashboard-import.json
config-deploymentgroup-export.json
config-deploymentgroup-import.json
config-environment-export.json
config-environment-import.json
config-git-export.json
config-git-import.json
config-gitbranchpolicy-export.json
config-gitbranchpolicy-import.json
config-iterationpath-export.json
config-iterationpath-import.json
config-pipeline-export.json
config-pipeline-import.json
config-pullrequest-export.json
config-pullrequest-import.json
config-query-export.json
config-query-import.json
config-team-export.json
config-team-import.json
config-testplan-export.json
config-testplan-import.json
config-variablegroup-export.json
config-variablegroup-import.json
config-wiki-export.json
config-wiki-import.json
config-witprocess-export.json
config-witprocess-import.json
config-witprocessxml-export.json
config-witprocessxml-import.json
config-workitem-export.json
config-workitem-import.json
For configuration samples and more documentation on usage of custom configuration files and individual export/import adapters, visit https://github.com/solidify/azure-devops-backup-tool-docs.
Enable access to the OAuth token
For the pipeline to run, you must give it access to the OAuth token; otherwise you will receive errors like the following:

```
SYSTEM_ACCESSTOKEN env var not set
```
For Classic Pipelines: Go to Edit Pipeline -> Agent Job 1 -> Additional Options and ensure the "Allow scripts to access the OAuth token" checkbox is ticked. Then save the pipeline and run it again.
For YAML Pipelines: Pass the token to the task as an environment variable:

```yaml
env:
  SYSTEM_ACCESSTOKEN: $(system.accesstoken)
```
Backup scenario, requirements and best practices
Running the task under a Service Account
For recurring backup jobs, we recommend running the backup as a service user that is detached from any individual employee's identity. The procedure for implementing a service user may vary between setups, but typically involves:
- Setting up a service user directly on the ADO organization/collection.
- OR setting up a service user in your Azure Active Directory.
- Managing the service user's project permissions.
Service Account, required permissions
The service account should have Contributor access at minimum.
You will require a PAT with the following scopes, depending on which resource types you are migrating:
- Read (for export jobs)
- Write/Create/Manage (for import jobs)
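The PAT itself should not be committed in plain text. A common pattern is to keep it as a secret variable in a variable group and reference it from the task. A minimal sketch, assuming a hypothetical variable group named ado-backup-secrets holding the migrationToken secret and an illustrative service account name:

```yaml
variables:
- group: ado-backup-secrets                            # illustrative variable group with the service account's PAT

steps:
- task: ado-backup-tool-export@1
  inputs:
    # ...other inputs as in the export example above...
    sourceUsername: 'backup.service@yourcompany.com'   # illustrative service account
    sourcePAT: '$(migrationToken)'                     # secret variable resolved from the variable group
```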
Dedicated backup server
For recurring backup jobs, we recommend storing the backup data on a dedicated backup server. This can be achieved by:
- Spinning up a VM for backup purposes.
- Exposing the drive to the ADO Pipeline Agent VM as a network share.
- Mounting the network share as a drive on the ADO Pipeline Agent VM.
- Finally, in the task, specifying the Migration Workspace as a path on the mounted network drive (see the sketch after this list).
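A minimal sketch of the mounting and workspace steps in the pipeline, assuming a hypothetical \\backup-server\AdoBackupFolder share and credentials stored as secret variables (all names are illustrative):

```yaml
# Map the backup share as X:\ on the agent before running the export
- script: net use X: \\backup-server\AdoBackupFolder $(backupSharePassword) /user:$(backupShareUser) /persistent:no
  displayName: 'Mount backup share'

# Point the Migration Workspace at the mounted network drive
- task: ado-backup-tool-export@1
  inputs:
    # ...other inputs as in the export example above...
    workspace: 'X:\AdoBackupWorkspace'
```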
Here is an example server architecture:
| Server | Platform | Firewall opening |
|---|---|---|
| (1) Azure DevOps Server | Windows Server | (2) |
| (2) Pipeline Agent | Windows Server | (1) and (3) |
| (3) Dedicated Backup Server | Any | (2) (via network share) |
In order to improve migration performance, you may consider using incremental backups.
With incremental backups enabled, Azure DevOps Backup Tool will figure out what resources have changed since the last backup, and only export the deltas. This leads to significant performance improvements for subsequent runs after the initial backup.
The recommended way to do incremental backups is to set the `workspace` variable in the task configuration to a persistent file area on your build server. This area should ideally be located outside of the build agent directory; for example, use a directory path like `C:\AdoBackupWorkspace` or `C:\BackupWorkspace`. This method of doing incremental backups requires you to own your build infrastructure.
The workflows that support incremental backups are:
- Git repositories
- Work Items
- Wikis
Other methods of doing incremental backups
- When using Microsoft-hosted build servers, it is possible to achieve the above workflow by first downloading the latest backup (ideally a .zip file) from your backup server and extracting it into your workspace, prior to running the Export task. This method adds some overhead to your pipeline because of the download and extraction steps (a sketch follows after this list).
- Specifically for Work Items, you can achieve a similar result by specifying the following WIT query in your configuration, where the `System.ChangedDate` property is scripted to be the start time of the last migration job:
```
SELECT
    *
FROM workitems
WHERE
    [System.TeamProject] = @project
    AND [System.ChangedDate] >= @StartOfDay-7
```
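A sketch of the first approach on a Microsoft-hosted agent: restore the previous snapshot into the workspace before exporting, so only the deltas are exported. How the previous zip archive reaches the agent depends on where your backups are stored; the paths below are illustrative.

```yaml
# Extract the previously downloaded snapshot into the workspace
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(Pipeline.Workspace)\previous-backup\Backup-latest.zip'
    destinationFolder: '$(Agent.TempDirectory)\AdoBackupWorkspace'
    cleanDestinationFolder: true

# Run the export against the pre-populated workspace so only changed resources are exported
- task: ado-backup-tool-export@1
  inputs:
    # ...other inputs as in the export example above...
    workspace: '$(Agent.TempDirectory)\AdoBackupWorkspace'
```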
Granular resource selection
For some workflows, you can be even more granular when selecting which resources to export by using name patterns and queries. More information follows below.
In order to enable resource selection, you must use Custom configuration files. For more information, see the section Advanced usage: use custom config files in this document.
Name patterns
Name patterns will allow you to select only resources that conform to a given name pattern with wildcard support.
The workflows that support name patterns are:
- Git repositories
- Pipelines
Queries
WIT queries allow you to select only the work items that match a given query.
This is only supported for Work Items.
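As an illustration, a WIQL query limiting the export to bugs under a given area path might look like this (the work item type and area path are illustrative; the exact place to put the query is defined by the work item configuration file templates linked above):

```
SELECT
    *
FROM workitems
WHERE
    [System.TeamProject] = @project
    AND [System.WorkItemType] = 'Bug'
    AND [System.AreaPath] UNDER 'ContosoAir\Web'
```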
Troubleshooting
Common error message: Invoke-RestMethod : The remote server returned an error: (401) Unauthorized
If you receive the below error message in your ADO Pipeline log:

```
Invoke-RestMethod : The remote server returned an error: (401) Unauthorized.
```
The solution is usually to check the build configuration of your Restore pipeline and ensure that both the source and target credentials are set. Here is an example:
```yaml
targetUsername: '$(sourceUsername)'
sourcePAT: '$(migrationToken)'
targetPAT: '$(migrationToken)'
```
Do you have questions, issues or a feature request?
For product inquiries and technical support related to Azure DevOps Backup Tool, reach out to support.adobackup@solidify.dev
For general inquiries, please get in touch using the Q&A section or our contact form: https://solidify.dev/contact