Sometimes we need to move local files, SQL scripts, or backups from our local machine to Azure, or vice versa. This can be done manually by logging in to Azure with a browser, but there are other methods to automate the task. This article describes the Microsoft Azure Storage Tool, a command-line utility used to upload data to Azure from a local machine or to download data from Azure to the local machine. The article describes step by step how to work with this tool.
- An Azure subscription.
- A local machine with Windows installed.
- The Microsoft Azure Storage Tools installer, which includes a PowerShell command line and can be downloaded here.
You will also need a storage account in Azure. For more information about creating storage in Azure, refer to the create storage section of our related article.
A container is also required in the storage account. For more information, refer to our article related to storage and containers.
The storage account includes an option to manage access keys. This option lets you handle the keys.
Req 3. The manage access keys option
The access keys will be used to connect to Azure from the AzCopy command line.
Req 4. The different access keys.
Once the tool is installed as specified, open the Microsoft Azure Storage command line.
Figure 1. The Microsoft Azure Storage command line
Let’s start by copying files from the local machine to Azure. Imagine that we have the following folder with SQL notes and SQL Server scripts on our local machine:
Figure 2. The local folder
In order to copy the information from the local folder to the container created in the requirements section, use the following command:
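A sketch of the upload command; the storage account name (mystorageaccount), container name (mycontainer), and access key are placeholders, and should be replaced with your own values from the requirements section:

```shell
# Upload the contents of the local c:\scripts3 folder to an Azure container.
# mystorageaccount, mycontainer, and <your-access-key> are placeholders.
AzCopy /Source:C:\scripts3 /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /DestKey:<your-access-key>
```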
AzCopy is the command to copy file(s) from/to Azure. It is similar to the copy command in cmd: it requires a source and a destination. In this example, the source is the c:\scripts3 folder and the destination is the URL of the container (see the Req 2 picture in the requirements section). The container will store the files of the scripts folder. Finally, DestKey is the key used to access Azure. You can find the key in the Req 4 picture of the requirements section.
Once you run the command, you will see a transfer summary similar to this one:
Figure 3. The AzCopy results
In order to verify, go to Storage in the Azure portal and click the storage account created in the requirements section.
Figure 4. The storage
In the storage account, go to the container created in the requirements section.
You will find the files of the local folder from Figure 2 copied to Azure.
In order to obtain more information about the AzCopy command, type this command at the command line:
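The built-in help is displayed with the /? switch:

```shell
# Show AzCopy's built-in help, with parameter descriptions and examples
AzCopy /?
```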
The command will show information about the parameters and some useful examples.
Figure 7. The Azcopy help
Let’s try another example. In this example we will copy myfile.txt from Azure to a local folder:
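A sketch of the download command; the storage account and container names and the key are placeholders. Note that when downloading, the key is passed with /SourceKey, since Azure is now the source:

```shell
# Download only myfile.txt from the Azure container to the local c:\test folder.
# mystorageaccount, mycontainer, and <your-access-key> are placeholders.
AzCopy /Source:https://mystorageaccount.blob.core.windows.net/mycontainer /Dest:C:\test /SourceKey:<your-access-key> /Pattern:"myfile.txt"
```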
The source is the URL of the Azure storage container. The destination is the c:\test folder, which is empty. The Azure container contains multiple files, but in this example we only want to copy the file myfile.txt. We will use the Pattern parameter for this purpose.
You will receive a message similar to this one after executing the command:
Figure 8. Downloading results
You will be able to see the file copied from Azure to your local machine:
Figure 9. The local folder with the file copied from Azure
There are other parameters that the command line includes:
- /S activates recursive mode, which includes the subfolders.
- /Y suppresses the confirmation prompts.
- /L lists the operations without performing them.
- /A uploads only files with the Archive attribute set.
- /MT keeps the source modified date and time.
- /XN excludes source files if they are newer than the destination.
- /XO excludes source files if they are older than the destination.
- /V:[verbose log-file] saves the output in a log file; by default, the log file is in %LocalAppData%\Microsoft\Azure\AzCopy.
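A sketch combining some of these switches: copy a folder recursively, suppress confirmation prompts, and write a verbose log to an explicit file (the paths, names, and key are placeholders):

```shell
# Recursive upload (/S) with no prompts (/Y) and a verbose log (/V:file).
# All names, paths, and the key are placeholders.
AzCopy /Source:C:\scripts3 /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /DestKey:<your-access-key> /S /Y /V:C:\logs\azcopy.log
```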
In this new example, we have a folder with multiple files. We want to copy only the PowerShell scripts:
Figure 10. Several files in a local folder
The following command will copy all the files with the ps (PowerShell) extension and store the results in a log file:
/DestKey:pGFSbfuUMjDBQZqIbfJuwksVp5== /Pattern:"*.Ps" /V
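The fragment above shows only the trailing switches; the full command would presumably also include a source folder and a destination URL. A sketch with placeholder values (note that PowerShell scripts commonly use the .ps1 extension, so adjust the pattern to match your files):

```shell
# Upload only PowerShell scripts matching the pattern, with verbose logging.
# The folder, storage account, container, and key are placeholders.
AzCopy /Source:C:\scripts /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /DestKey:<your-access-key> /Pattern:"*.ps1" /V
```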
If everything is OK, you will receive a message similar to this one:
Figure 11. The copy results
You will also be able to see the results in a log file. When no file is specified, the /V parameter stores the information in the %LocalAppData%\Microsoft\Azure\AzCopy folder.
You will also be able to see the ps files copied in Azure.
Finally, you can execute commands from a file. Let’s create a file with AzCopy parameters named AzCommand.txt:
/DestKey:pGFSbfuUMjDBQZqIbfJuwksVp5BkD7CFv/GxvdrOwiWvAYGLc5D5J5ZKjtIpipb2djiaEmOX3QhExWVOHSC0sQ== /Pattern:"*.Ps" /V
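As with the earlier example, this line shows only the trailing switches; a complete parameter file would presumably also contain the source and destination, along the lines of the following sketch (all names, paths, and the key are placeholders):

```shell
# Hypothetical full contents of AzCommand.txt: AzCopy parameters only,
# without the AzCopy command name itself. All values are placeholders.
/Source:C:\scripts /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /DestKey:<your-access-key> /Pattern:"*.Ps" /V
```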
Now, execute the txt file with the following command line:
The command will execute the content of the txt file. If everything is OK, you will have the following result:
You can also verify in Azure that the files were copied.
Figure 14. The transfer summary
As you can see, the Azure Storage Tools can be used to automate tasks from the command line. Once you have a storage account with a container in Azure, the process to upload or download files is very simple. We also learned different parameters and how to specify patterns to choose specific files.
The AzCopy command line is currently at version 4.1 (the current preview version). New versions may include new commands in the future. Make sure to have the latest version.
For more information, refer to these links:
He has worked for the government, oil companies, web sites, magazines and universities around the world. Daniel also regularly speaks at SQL Server conferences and blogs. He is also a writer of SQL Server training material for certification exams.