
Azure Automation: Export Azure SQL Database to Blob Storage in a BACPAC file

May 5, 2021 by Rajendra Gupta

Azure SQL Database supports exporting its schema and data in a compressed format called BACPAC. We can store this BACPAC file in an Azure storage account. Later, the BACPAC file can be used to restore the database to Azure SQL Database, Azure SQL Managed Instance, or an on-premises SQL Server.

The below image shows the high-level workflow for exporting an Azure SQL database.

High-level workflow for database export

You can export the SQL database in the following ways:

  • The Azure portal
  • The SqlPackage utility
  • PowerShell scripts
  • SQL Server Management Studio
  • The Azure CLI

Suppose you want to export your database to Azure storage regularly. Following the manual approach would be a tedious and repetitive task; therefore, we should always look for automation opportunities.

In the previous articles of this Azure automation series, we covered the following topics.

  • Automate Pause and Resume of Azure Analysis Services
  • Automation runbook to create a transactionally consistent copy of the Azure database

In this article, we use automation runbooks to do the following tasks.

  • Export the Azure SQL database to an Azure storage account on a specific schedule
  • Automatically remove expired exports (older than 7 days) from the storage account

Requirements

To work with this article, you need the following environment setup.

  • Azure SQL Database: You require a running instance of the Azure SQL database. You can refer to the article, Create Azure SQL Database using Azure PowerShell.
  • Azure Automation Account: You require an automation account to import modules, create runbooks, and publish and schedule them.

Create an Azure automation runbook to export the Azure SQL database

In the Azure portal, navigate to your automation account and create a new PowerShell Runbook. Here, I have created a runbook named [azureexportautomation].

automation account

Now, before we write the PowerShell script in the runbook, we do the following.

Create automation variables

We can define variables in two ways in the runbook.

  • Create mandatory parameters in the PowerShell script: In the article, Azure Automation: Create a database copy of Azure SQL Database, we defined a mandatory parameter for entering the new database name.

  • Define Azure automation variables: If your input values remain the same, it is advisable to use automation variables. You can define a variable along with its value and later fetch the value directly in the script. It saves you from entering each parameter value repeatedly.
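As a minimal sketch of the first option, a mandatory runbook parameter looks like this (the parameter name below is an assumption for illustration, not from this article):

```powershell
# A mandatory runbook parameter -- Azure Automation prompts for this
# value each time the runbook starts
param (
    [Parameter(Mandatory = $true)]
    [string] $newDatabaseName
)

Write-Output "New database name: $newDatabaseName"
```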

For this article, I have defined the following automation variables.

  • sqlserver: It stores the Azure SQL server name
  • database: It stores the Azure SQL database name
  • resourcegroupname: It stores the resource group of our Azure database
  • storageaccountname: It stores the Azure storage account name
  • BlobStorageEndpoint: It stores the storage account's unique blob endpoint URL
  • BlobContainerName: It is the name of the blob container in your Azure storage account
  • accesskey: It stores the access key for connecting to the storage account

You can refer to the article on the Azure storage account for configuring a blob container.

Create automation variables

You can get these values from your azure portal.

Variable value

PowerShell script to connect to Azure

In the initial part, we create an AzureLogin() function and specify the connection name AzureRunAsConnection. It uses the cmdlets Get-AutomationConnection and Connect-AzAccount to connect to Azure resources from Azure Automation.

The Connect-AzAccount cmdlet is part of the Az.Accounts PowerShell module. If you have not imported it into the automation account, you can click on Browse Gallery, search for the module and import it.
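Using the article's naming (the AzureLogin function and the AzureRunAsConnection connection), the login function might be sketched as follows. This is the standard Run As account pattern, not necessarily the article's exact script:

```powershell
function AzureLogin {
    # Fetch the Run As connection created with the automation account
    $servicePrincipalConnection = Get-AutomationConnection -Name "AzureRunAsConnection"

    # Authenticate to Azure with the service principal certificate
    Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}

AzureLogin
```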

Before we move forward, let’s click on the Test pane and validate that the PowerShell function works correctly. It returns the environment details along with our custom messages, which shows that the function is working fine.

PowerShell script to connect with the Azure

Fetch Azure automation variables values in the runbook script

In this step, we fetch the automation variables and their values into PowerShell script variables. We use the cmdlet Get-AutomationVariable to populate the variables $SQLServerName, $database, $resourcegroupname, $storageaccountname, $BlobStorageEndpoint, $BlobContainerName and $accesskey.

To check the values, we print each variable using the Write-Output command. As shown below, it displays all variable values in the PowerShell script.
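A sketch of this step, assuming the automation variable names defined earlier:

```powershell
# Fetch the automation variables into script variables (names must
# match the automation variables exactly)
$SQLServerName       = Get-AutomationVariable -Name "sqlserver"
$database            = Get-AutomationVariable -Name "database"
$resourcegroupname   = Get-AutomationVariable -Name "resourcegroupname"
$storageaccountname  = Get-AutomationVariable -Name "storageaccountname"
$BlobStorageEndpoint = Get-AutomationVariable -Name "BlobStorageEndpoint"
$BlobContainerName   = Get-AutomationVariable -Name "BlobContainerName"
$storagekey          = Get-AutomationVariable -Name "accesskey"

# Print the values to verify them in the output pane
Write-Output "Server: $SQLServerName  Database: $database"
Write-Output "Resource group: $resourcegroupname  Storage account: $storageaccountname"
```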

Fetch Azure automation variables

PowerShell script to export Azure SQL database into Azure storage container

In this central part of the script, we use PowerShell cmdlets to export the Azure SQL database into the Azure storage container.

The below script does the following tasks.

  • First, we define the naming convention for the BACPAC export of the Azure database. In this article, we define the export file format as AzureDBName.ddMMyyyyhhss.bacpac

  • These BACPAC files are stored in the blob container of the Azure storage account. Therefore, we combine the Azure blob storage endpoint, the blob container name and the BACPAC file name. The variable $bacpac holds the full path of the BACPAC file.

  • To verify the export file name and path, we print them using the Write-Output command.

  • Here, we have hardcoded the Azure SQL database username and password. Usually, you would use either a PowerShell script parameter or an automation variable for credentials. For the SQL user password, we use ConvertTo-SecureString to convert the plain-text password into a secure string. The variable $SecurePassword stores this secure string.

  • For the Azure database export, we use the cmdlet New-AzSqlDatabaseExport and specify the following parameters.
    • -ResourceGroupName: The resource group of the Azure database. We already have it in the variable $resourcegroupname
    • -ServerName: The Azure SQL server name. We specify the variable $SQLServerName for its value
    • -DatabaseName: The Azure database name. We already have it in the $database variable
    • -StorageKeyType: In this article, we use access keys to connect to the Azure storage account. Therefore, we specify the value StorageAccessKey here
    • -StorageKey: The access key for connecting to the storage account. We stored it earlier in the variable $storagekey
    • -StorageUri: The full path of the BACPAC file. We already generated it in the variable $bacpac
    • -AdministratorLogin: The SQL admin user for the Azure SQL database
    • -AdministratorLoginPassword: This parameter does not accept a plain-text password. Therefore, we specify the secure string stored in the $SecurePassword variable
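Putting these steps together, the export portion of the runbook might look like the sketch below. The variable names follow the earlier sections, and the SQL credentials are placeholders you must replace with your own:

```powershell
# Build a unique BACPAC name: <database>.<ddMMyyyyhhss>.bacpac
$bacpacFilename = $database + "." + (Get-Date).ToString("ddMMyyyyhhss") + ".bacpac"
$bacpac = $BlobStorageEndpoint + $BlobContainerName + "/" + $bacpacFilename

# Verify the export file name and path in the output pane
Write-Output "Export file: $bacpacFilename"
Write-Output "Export path: $bacpac"

# Placeholder hardcoded credentials -- prefer a runbook parameter or
# an automation variable (or a credential asset) in production
$SQLAdmin = "sqladmin"
$SecurePassword = ConvertTo-SecureString -String "YourStrongPassword!" -AsPlainText -Force

# Start the database export to the storage container
$exportRequest = New-AzSqlDatabaseExport `
    -ResourceGroupName $resourcegroupname `
    -ServerName $SQLServerName `
    -DatabaseName $database `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey $storagekey `
    -StorageUri $bacpac `
    -AdministratorLogin $SQLAdmin `
    -AdministratorLoginPassword $SecurePassword
```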

Check the status of Azure database export to Blob Storage

The database export might take a while, depending on your database size and storage. Therefore, we use the cmdlet Get-AzSqlDatabaseImportExportStatus. It takes the operation status link from the $exportRequest variable and checks the status in a while loop. If the export operation is in progress, the loop sleeps for 10 seconds and checks the status again. Finally, it prints the status of the BACPAC file export.
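A minimal sketch of this polling loop, assuming $exportRequest holds the result of New-AzSqlDatabaseExport:

```powershell
# Poll the export operation until it leaves the InProgress state
$exportStatus = Get-AzSqlDatabaseImportExportStatus `
    -OperationStatusLink $exportRequest.OperationStatusLink

while ($exportStatus.Status -eq "InProgress") {
    # Wait 10 seconds between status checks
    Start-Sleep -Seconds 10
    $exportStatus = Get-AzSqlDatabaseImportExportStatus `
        -OperationStatusLink $exportRequest.OperationStatusLink
    Write-Output $exportStatus.StatusMessage
}

Write-Output "Export finished with status: $($exportStatus.Status)"
```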

Clear the old database exports from the Azure storage container

Suppose we want to remove the old database exports (BACPAC files) from the Azure storage container. We might have one or many BACPAC files in the container, depending upon how frequently we take database exports.

Therefore, to remove the old blobs, we do the following things.

  • Use the cmdlet New-AzStorageContext to create an Azure storage context. It requires the storage account name ($storageaccountname) and the storage access key ($storagekey)
  • Use the cmdlet Get-AzStorageBlob to get the list of blobs (BACPAC files) from the container
  • Next, use a foreach loop to remove the blobs with the Remove-AzStorageBlob PowerShell cmdlet. In this command, we specify the storage context and the blob container name

Note: This step requires the cmdlets New-AzStorageContext, Get-AzStorageBlob and Remove-AzStorageBlob from the Az.Storage module. If you have not imported it earlier, you can browse the module gallery in the Azure automation account and import it.
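A sketch of the cleanup step. The Where-Object filter keeps only blobs older than 7 days, matching the retention goal stated at the start of the article; drop the filter if you want to remove every existing export:

```powershell
# Build a storage context from the account name and access key
$context = New-AzStorageContext `
    -StorageAccountName $storageaccountname `
    -StorageAccountKey $storagekey

# List the blobs (BACPAC files) in the container that are older than 7 days
$cutoff = (Get-Date).AddDays(-7)
$oldBlobs = Get-AzStorageBlob -Container $BlobContainerName -Context $context |
    Where-Object { $_.LastModified -lt $cutoff }

# Remove each expired export
foreach ($blob in $oldBlobs) {
    Write-Output ("Removing expired export: " + $blob.Name)
    Remove-AzStorageBlob -Blob $blob.Name -Container $BlobContainerName -Context $context
}
```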

PowerShell script to export azure SQL database

Execute the runbook using the Test pane and verify the Azure database export

We have completed the PowerShell script for exporting the Azure SQL database to blob storage. Here, before we execute the runbook, you can see a sample BACPAC file from my previous run.

Execute the Runbook using Test-Pane

Once we execute the script, it should remove this BACPAC file and place a new database export. In the runbook, click on the Test pane and then Start.

remove this BACPAC file

You can go to the Azure SQL database dashboard and check the activity log to view the BACPAC export entries. As shown below, the status moves from Accepted to Started to Succeeded.

activity log for viewing BACPAC export

As shown below, it removed the older database exports and created a new export in the storage container.

older database exports

Complete PowerShell script for azure database export

Previously, we covered the PowerShell script in small chunks. You can use the below script to prepare your runbook and configure it with your custom variables and parameters.
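As a reference, here is a consolidated sketch of the whole runbook, assembled from the steps discussed above. The Run As connection name, the automation variable names and the SQL credentials are assumptions; adjust them to your environment. The cleanup runs before the new export so the freshly created BACPAC file is not deleted:

```powershell
# --- Connect to Azure with the Run As account ---
function AzureLogin {
    $servicePrincipalConnection = Get-AutomationConnection -Name "AzureRunAsConnection"
    Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
AzureLogin

# --- Fetch the automation variables ---
$SQLServerName       = Get-AutomationVariable -Name "sqlserver"
$database            = Get-AutomationVariable -Name "database"
$resourcegroupname   = Get-AutomationVariable -Name "resourcegroupname"
$storageaccountname  = Get-AutomationVariable -Name "storageaccountname"
$BlobStorageEndpoint = Get-AutomationVariable -Name "BlobStorageEndpoint"
$BlobContainerName   = Get-AutomationVariable -Name "BlobContainerName"
$storagekey          = Get-AutomationVariable -Name "accesskey"

# --- Remove exports older than 7 days ---
$context = New-AzStorageContext `
    -StorageAccountName $storageaccountname -StorageAccountKey $storagekey
$cutoff = (Get-Date).AddDays(-7)
Get-AzStorageBlob -Container $BlobContainerName -Context $context |
    Where-Object { $_.LastModified -lt $cutoff } |
    ForEach-Object {
        Write-Output ("Removing expired export: " + $_.Name)
        Remove-AzStorageBlob -Blob $_.Name -Container $BlobContainerName -Context $context
    }

# --- Start the database export ---
$bacpacFilename = $database + "." + (Get-Date).ToString("ddMMyyyyhhss") + ".bacpac"
$bacpac = $BlobStorageEndpoint + $BlobContainerName + "/" + $bacpacFilename
Write-Output "Export path: $bacpac"

# Placeholder credentials -- replace with your own
$SQLAdmin = "sqladmin"
$SecurePassword = ConvertTo-SecureString -String "YourStrongPassword!" -AsPlainText -Force

$exportRequest = New-AzSqlDatabaseExport `
    -ResourceGroupName $resourcegroupname `
    -ServerName $SQLServerName `
    -DatabaseName $database `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey $storagekey `
    -StorageUri $bacpac `
    -AdministratorLogin $SQLAdmin `
    -AdministratorLoginPassword $SecurePassword

# --- Poll until the export completes ---
$exportStatus = Get-AzSqlDatabaseImportExportStatus `
    -OperationStatusLink $exportRequest.OperationStatusLink
while ($exportStatus.Status -eq "InProgress") {
    Start-Sleep -Seconds 10
    $exportStatus = Get-AzSqlDatabaseImportExportStatus `
        -OperationStatusLink $exportRequest.OperationStatusLink
}
Write-Output "Export finished with status: $($exportStatus.Status)"
```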

Publish and schedule the runbook

Once you have tested the PowerShell script, publish the runbook and link it to an automation account schedule. The runbook then executes on its schedule, removes the older exports from the storage container and takes a new database export.

Click on Publish.

Publish Runbook

In the runbook dashboard, its status changes to Published.

Status change

You can click on the Start button to execute this runbook.

Click on the Start button

While the runbook executes, its status is Running.

Status is Running

Once runbook execution completes, click on the Output tab and it displays custom outputs we specified in the script.

View output

In the below image, we have a new SQL database export that is created by runbook execution.

Database export

Conclusion

This article explored Azure automation for exporting an Azure SQL database into a storage container. The automation helps you schedule exports at your convenience, and it does not require any manual intervention. You can use the BACPAC export to restore the database on both on-premises SQL Server and Azure. If you plan to delete the Azure database for cost savings, you should export it first for future reference.

Table of contents

Azure Automation: Export Azure SQL Database to Blob Storage in a BACPAC file
Azure Automation: Create database copies of Azure SQL Database
Azure Automation: Automate Azure SQL Database indexes and statistics maintenance
Azure Automation: Automate Pause and Resume of Azure Analysis Services
Azure Automation: Automate data loading from email attachments using Azure Logic Apps
Azure Automation: Building approval-based automated workflows using Azure Logic Apps
Azure Automation: Auto-scaling Azure SQL database with Azure Logic Apps
Azure Automation: Analyzing Twitter sentiments using Azure Logic Apps
Azure Automation: Use Azure Logic Apps to import data into Azure SQL Database from Azure Blob Storage
Azure Automation: Publish LinkedIn posts and tweets automatically using Azure Logic Apps
Azure Automation: Export Azure SQL database data to CSV files using Azure Logic Apps
Azure Automation: Translate documents into different languages using Cognitive Services with Azure Logic Apps
Azure Automation: Azure Logic Apps for face recognition and insert its data into Azure SQL Database


About Rajendra Gupta

As an MCSA certified and Microsoft Certified Trainer in Gurgaon, India, with 13 years of experience, Rajendra works for a variety of large companies focusing on performance optimization, monitoring, high availability, and disaster recovery strategies and implementation. He is the author of hundreds of authoritative articles on SQL Server, Azure, MySQL, Linux, Power BI, performance tuning, AWS/Amazon RDS, Git, and related technologies that have been viewed by over 10m readers to date. He is the creator of one of the biggest free online collections of articles on a single topic, with his 50-part series on SQL Server Always On Availability Groups. Based on his contribution to the SQL Server community, he has been recognized with various awards, including the prestigious "Best author of the year" award in both 2020 and 2021 at SQLShack. Raj is always interested in new challenges, so if you need consulting help on any subject covered in his writings, he can be reached at rajendra.gupta16@gmail.com.
