Transfer data with AzCopy v10 (Preview)
AzCopy v10 (Preview) is the next-generation command-line utility for copying data to and from Microsoft Azure Blob and File storage. It offers a redesigned command-line interface and a new architecture for high-performance, reliable data transfers. Using AzCopy, you can copy data between a file system and a storage account, or between storage accounts.
What's new in AzCopy v10
- Synchronize a file system to Azure Blob storage or vice versa. Use azcopy sync <source> <destination>. Ideal for incremental copy scenarios.
- Supports Azure Data Lake Storage Gen2 APIs. Use myaccount.dfs.core.windows.net as the URI to call the ADLS Gen2 APIs.
- Supports copying an entire account (Blob service only) to another account.
- Account-to-account copy now uses the new Put Block from URL API. No data is transferred to the client, which makes the transfer faster!
- List/Remove files and blobs in a given path.
- Supports wildcard patterns in a path as well as --include and --exclude flags.
- Improved resiliency: every AzCopy instance will create a job order and a related log file. You can view and restart previous jobs and resume failed jobs. AzCopy will also automatically retry a transfer after a failure.
- General performance improvements.
Download and install AzCopy
Latest preview version (v10)
Download the latest preview version of AzCopy.
Latest production version (v8.1)
Download the latest production version of AzCopy for Windows.
AzCopy with Table storage service support (v7.3)
AzCopy v10 does not require an installation. Open your preferred command-line application and navigate to the folder where the azcopy.exe executable is located. If desired, you can add the AzCopy folder location to your system path.
AzCopy v10 allows you to use the following options when authenticating with Azure Storage:
- Azure Active Directory [Supported on Blob and ADLS Gen2]. Use .\azcopy login to sign in using Azure Active Directory. The user should have the "Storage Blob Data Contributor" role assigned to write to Blob storage using Azure Active Directory authentication.
- SAS tokens [Supported on Blob and File service]. Append the SAS token to the blob path on the command line to use it. You can generate a SAS token using the Azure Portal, Storage Explorer, PowerShell, or other tools of your choice. For more information, see the examples below.
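Mechanically, "appending" the SAS token just means concatenating its query string onto the resource URL. A minimal sketch (the SAS value below is a made-up placeholder, not a working token):

```shell
# Build a blob URL with a SAS token appended (placeholder values, not a real token)
url="https://account.blob.core.windows.net/container"
sas="?sv=2018-03-28&ss=b&sig=REDACTED"
echo "${url}${sas}"
```

Quote the resulting URL on the command line, since SAS tokens contain & characters that the shell would otherwise interpret.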
AzCopy v10 has a simple, self-documenting syntax. The general syntax looks as follows:
.\azcopy <command> <arguments> --<flag-name>=<flag-value>

# Examples if you have logged into the Azure Active Directory:
.\azcopy copy <source path> <destination path> --<flag-name>=<flag-value>
.\azcopy cp "C:\local\path" "https://account.blob.core.windows.net/container" --recursive=true
.\azcopy cp "C:\local\path\myfile" "https://account.blob.core.windows.net/container/myfile"
.\azcopy cp "C:\local\path\*" "https://account.blob.core.windows.net/container"

# Examples if you are using SAS tokens to authenticate:
.\azcopy cp "C:\local\path" "https://account.blob.core.windows.net/container?sastoken" --recursive=true
.\azcopy cp "C:\local\path\myfile" "https://account.blob.core.windows.net/container/myfile?sastoken"
Here's how you can get a list of available commands:
.\azcopy --help
# Using the alias instead
.\azcopy -h
To see the help page and examples for a specific command, run the command below:
.\azcopy <cmd> --help
# Example:
.\azcopy cp -h
Create a Blob container or File share
Create a Blob container
.\azcopy make "https://account.blob.core.windows.net/container-name"
Create a File share
.\azcopy make "https://account.file.core.windows.net/share-name"
Create a Blob container using ADLS Gen2
If you've enabled hierarchical namespaces on your blob storage account, you can use the following command to create a new file system (Blob container) so that you can upload files to it.
.\azcopy make "https://account.dfs.core.windows.net/top-level-resource-name"
Copy data to Azure Storage
Use the copy command to transfer data from the source to the destination. The source/destination can be a:
- Local file system
- Azure Blob/Virtual Directory/Container URI
- Azure File/Directory/File Share URI
- Azure Data Lake Storage Gen2 Filesystem/Directory/File URI
.\azcopy copy <source path> <destination path> --<flag-name>=<flag-value>
# Using the alias instead
.\azcopy cp <source path> <destination path> --<flag-name>=<flag-value>
The following command uploads all files under the folder C:\local\path recursively to the mycontainer1 container, creating a path directory in the container:
.\azcopy cp "C:\local\path" "https://account.blob.core.windows.net/mycontainer1<sastoken>" --recursive=true
The following command uploads all files under the folder C:\local\path (without recursing into the subdirectories) to the mycontainer1 container:
.\azcopy cp "C:\local\path\*" "https://account.blob.core.windows.net/mycontainer1<sastoken>"
To get more examples, use the following command:
.\azcopy cp -h
Copy data between two storage accounts
Copying data between two storage accounts uses the Put Block From URL API and does not use the client machine's network bandwidth. Data is copied between two Azure Storage servers directly, while AzCopy simply orchestrates the copy operation. This option is currently only available for Blob storage.
To copy the data between two storage accounts, use the following command:
.\azcopy cp "https://myaccount.blob.core.windows.net/<sastoken>" "https://myotheraccount.blob.core.windows.net/<sastoken>" --recursive=true
The command enumerates all blob containers and copies them to the destination account. At this time, AzCopy v10 supports copying only block blobs between two storage accounts. All other storage account objects (append blobs, page blobs, files, tables, and queues) are skipped.
Copy a VHD image to a storage account
AzCopy v10 uploads data into block blobs by default. However, if a source file has a .vhd extension, AzCopy v10 will upload it to a page blob by default. This behavior currently isn't configurable.
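To see which uploads this rule would affect, you can classify files by extension locally before running AzCopy. A small sketch (the sample folder, file names, and classification logic below are illustrative, mirroring the .vhd rule described above):

```shell
# Create a throwaway sample folder (hypothetical contents) and classify each
# file the way AzCopy v10 would: .vhd -> page blob, everything else -> block blob.
src=$(mktemp -d)
touch "$src/disk1.vhd" "$src/notes.txt"

plan=""
for f in "$src"/*; do
  case "$f" in
    *.vhd) plan="$plan$(basename "$f") -> page blob
" ;;
    *)     plan="$plan$(basename "$f") -> block blob
" ;;
  esac
done
printf '%s' "$plan"
```

For this sample folder, disk1.vhd is classified as a page blob and notes.txt as a block blob.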
Sync: incremental copy and delete (Blob storage only)
The sync command synchronizes contents from the source to the destination, which includes the DELETION of destination files if they do not exist in the source. Make sure you specify the destination you intend to synchronize.
To sync your local file system to a storage account, use the following command:
.\azcopy sync "C:\local\path" "https://account.blob.core.windows.net/mycontainer1<sastoken>" --recursive=true
In the same way you can sync a Blob container down to a local file system:
# If you're using Azure Active Directory authentication, the sastoken is not required
.\azcopy sync "https://account.blob.core.windows.net/mycontainer1" "C:\local\path" --recursive=true
The command incrementally syncs the source to the destination based on last-modified timestamps. If you add or delete a file in the source, AzCopy v10 does the same in the destination. Before deleting anything, AzCopy prompts you to confirm the deletion of the files.
Configure proxy settings
To configure the proxy settings for AzCopy v10, set the environment variable https_proxy using the following command:
# For Windows:
set https_proxy=<proxy IP>:<proxy port>
# For Linux:
export https_proxy=<proxy IP>:<proxy port>
# For MacOS:
export https_proxy=<proxy IP>:<proxy port>
Set the environment variable AZCOPY_CONCURRENCY_VALUE to configure the number of concurrent requests and control the throughput performance and resource consumption. The value is set to 300 by default. Reducing the value will limit the bandwidth and CPU used by AzCopy v10.
# For Windows:
set AZCOPY_CONCURRENCY_VALUE=<value>
# For Linux:
export AZCOPY_CONCURRENCY_VALUE=<value>
# For MacOS:
export AZCOPY_CONCURRENCY_VALUE=<value>
AzCopy v10 creates log files and plan files for all jobs. You can use the logs to investigate and troubleshoot potential problems. The logs contain the failure status (UPLOADFAILED, COPYFAILED, and DOWNLOADFAILED), the full path, and the reason for the failure. The job logs and plan files are located in the %USERPROFILE%\.azcopy folder.
Review the logs for errors
The following command will get all errors with UPLOADFAILED status from the 04dc9ca9-158f-7945-5933-564021086c79 log:
cat 04dc9ca9-158f-7945-5933-564021086c79.log | grep -i UPLOADFAILED
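If you want to try the filter without running a real transfer, the same pipeline works against a sample log. The file name, paths, and message text below are invented for illustration; a real job log lives in the .azcopy folder:

```shell
# Create a sample log and filter it for failed uploads, as you would with a
# real job log (the entries here are made-up examples, not real AzCopy output).
log=$(mktemp)
cat > "$log" <<'EOF'
2018/10/20 01:27:50 UPLOADFAILED: C:\local\path\file1.txt : 403 server returned an authorization failure
2018/10/20 01:27:51 UPLOADSUCCESSFUL: C:\local\path\file2.txt
EOF
grep -i UPLOADFAILED "$log"
```

On Windows, PowerShell's Select-String cmdlet (Select-String -Pattern UPLOADFAILED) gives an equivalent filter.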
View and resume jobs
Each transfer operation will create an AzCopy job. You can view the history of jobs using the following command:
.\azcopy jobs list
To view the job statistics, use the following command:
.\azcopy jobs show <job-id>
To filter the transfers by status, use the following command:
.\azcopy jobs show <job-id> --with-status=Failed
You can resume a failed or cancelled job using its identifier. Because the SAS token is not persisted (for security reasons), you may need to supply it again when resuming:
.\azcopy jobs resume <job-id> --source-sas="<source-sas-token>" --destination-sas="<destination-sas-token>"
Change the default log level
By default, the AzCopy log level is set to INFO. To reduce the log verbosity and save disk space, override the setting using the --log-level option. Available log levels are: DEBUG, INFO, WARNING, ERROR, PANIC, and FATAL. For example:
.\azcopy cp <source path> <destination path> --log-level=ERROR
Your feedback is always welcome. If you have any questions, issues, or general feedback, submit them at https://github.com/Azure/azure-storage-azcopy. Thank you!