Transfer data with AzCopy v10 (Preview)

AzCopy v10 (Preview) is the command-line utility for copying data to or from Microsoft Azure Blob and File storage. AzCopy v10 offers a redesigned command-line interface and a new architecture for reliable data transfers. By using AzCopy, you can copy data between a file system and a storage account, or between storage accounts.

What's new in AzCopy v10

  • Synchronizes file systems to Azure Blob storage or vice versa. Use azcopy sync <source> <destination>. Ideal for incremental copy scenarios.
  • Supports Azure Data Lake Storage Gen2 APIs. Use myaccount.dfs.core.windows.net as a URI to call the Data Lake Storage Gen2 APIs.
  • Supports copying an entire account (Blob service only) to another account.
  • Uses the new Put Block from URL APIs to support account-to-account copy. The data transfer is faster, since transfer to the client isn't required.
  • Lists or removes files and blobs in a given path (see the sketch after this list).
  • Supports wildcard patterns in a path and the --exclude flag.
  • Creates a job order and a related log file with every AzCopy instance. You can view and restart previous jobs, and resume failed jobs. AzCopy will also automatically retry a transfer after a failure.
  • Features general performance improvements.
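
For example, the list and remove commands mentioned above both operate on a path. Here's a minimal sketch; the account, container, blob name, and SAS token are placeholders:

.\azcopy list "https://account.blob.core.windows.net/container?sastoken"
.\azcopy remove "https://account.blob.core.windows.net/container/blob-name?sastoken"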

Download and install AzCopy

Latest preview version (v10)

Download the latest preview version of AzCopy:

Latest production version (v8.1)

Download the latest production version of AzCopy for Windows.

AzCopy with Table storage support (v7.3)

Download AzCopy v7.3, which supports copying data to and from the Microsoft Azure Table storage service.

Post-installation steps

AzCopy v10 doesn't require an installation. Open your preferred command-line application and browse to the folder where azcopy.exe is located. If needed, you can add the AzCopy folder location to your system path for ease of use.
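
For example, you could add the folder to the path for the current session. This is only a sketch; C:\azcopy and ~/azcopy stand in for wherever you placed the azcopy executable:

# For Windows:
set PATH=%PATH%;C:\azcopy
# For Linux:
export PATH=$PATH:~/azcopy
# For macOS:
export PATH=$PATH:~/azcopy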

Authentication options

AzCopy v10 supports the following options when authenticating with Azure Storage:

  • Azure Active Directory (supported for the Blob and Data Lake Storage Gen2 services). Use .\azcopy login to sign in with Azure Active Directory. The user must have the "Storage Blob Data Contributor" role assigned to write to Blob storage with Azure Active Directory authentication. To authenticate by using managed identities for Azure resources, use azcopy login --identity. See the examples after this list.
  • Shared access signature (SAS) tokens (supported for the Blob and File services). Append the SAS token to the blob path on the command line to use it. You can generate SAS tokens with the Azure portal, Storage Explorer, PowerShell, or another tool of your choice. For more information, see examples.
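
For example, the sign-in commands from the first bullet look like the following:

.\azcopy login
# Sign in by using a managed identity assigned to the Azure resource
.\azcopy login --identity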

Getting started

Tip

Prefer a graphical user interface?

Azure Storage Explorer, a desktop client that simplifies managing Azure Storage data, now uses AzCopy to accelerate data transfer into and out of Azure Storage.

Enable AzCopy in Storage Explorer under the Preview menu.

AzCopy v10 has a self-documenting syntax. After you've signed in with Azure Active Directory, the general syntax looks like the following:

.\azcopy <command> <arguments> --<flag-name>=<flag-value>

# Examples if you've signed in with Azure Active Directory:
.\azcopy copy <source path> <destination path> --<flag-name>=<flag-value>
.\azcopy cp "C:\local\path" "https://account.blob.core.windows.net/container" --recursive=true
.\azcopy cp "C:\local\path\myfile" "https://account.blob.core.windows.net/container/myfile"
.\azcopy cp "C:\local\path\*" "https://account.blob.core.windows.net/container"

# Examples if you're using SAS tokens to authenticate:
.\azcopy cp "C:\local\path" "https://account.blob.core.windows.net/container?sastoken" --recursive=true
.\azcopy cp "C:\local\path\myfile" "https://account.blob.core.windows.net/container/myfile?sastoken"

Here's how you can get a list of available commands:

.\azcopy --help
# To use the alias instead
.\azcopy -h

To see the help page and examples for a specific command, run the following command:

.\azcopy <cmd> --help
# Example:
.\azcopy cp -h

Create a blob container or file share

Create a blob container

.\azcopy make "https://account.blob.core.windows.net/container-name"

Create a file share

.\azcopy make "https://account.file.core.windows.net/share-name"

Create a blob container by using Azure Data Lake Storage Gen2

If you've enabled a hierarchical namespace on your Blob storage account, you can use the following command to create a new blob container for uploading files.

.\azcopy make "https://account.dfs.core.windows.net/top-level-resource-name"

Copy data to Azure Storage

Use the copy command to transfer data from the source to the destination. The source or destination can be a:

  • Local file system
  • Azure Blob/Virtual Directory/Container URI
  • Azure File/Directory/File Share URI
  • Azure Data Lake Storage Gen2 Filesystem/Directory/File URI

.\azcopy copy <source path> <destination path> --<flag-name>=<flag-value>
# Using the alias instead
.\azcopy cp <source path> <destination path> --<flag-name>=<flag-value>

The following command uploads all files under the folder C:\local\path recursively to the container mycontainer1, creating the path directory inside the container:

.\azcopy cp "C:\local\path" "https://account.blob.core.windows.net/mycontainer1<sastoken>" --recursive=true

The following command uploads all files under the folder C:\local\path (without recursing into the subdirectories) to the container mycontainer1:

.\azcopy cp "C:\local\path\*" "https://account.blob.core.windows.net/mycontainer1<sastoken>"

To find more examples, use the following command:

.\azcopy cp -h

Copy data between two storage accounts

Copying data between two storage accounts uses the Put Block From URL API, and doesn't use the client machine's network bandwidth. Data is copied between two Azure Storage servers directly, while AzCopy simply orchestrates the copy operation. This option is currently only available for Blob storage.

To copy the data between two storage accounts, use the following command:

.\azcopy cp "https://myaccount.blob.core.windows.net/<sastoken>" "https://myotheraccount.blob.core.windows.net/<sastoken>" --recursive=true

Note

This command will enumerate all blob containers and copy them to the destination account. At this time, AzCopy v10 supports copying only block blobs between two storage accounts. It will skip all other storage account objects (such as append blobs, page blobs, files, tables, and queues).

Copy a VHD image to a storage account

By default, AzCopy v10 uploads data to block blobs. However, if a source file has a .vhd extension, AzCopy v10 uploads it to a page blob instead. At this time, this behavior isn't configurable.
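
For example, an upload like the following would create a page blob because of the .vhd extension. The file path, container, and SAS token are placeholders:

.\azcopy cp "C:\local\path\disk.vhd" "https://account.blob.core.windows.net/container<sastoken>"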

Sync: incremental copy and delete (Blob storage only)

The sync command synchronizes the contents of a source directory to a directory in the destination by comparing file names and last modified timestamps. The operation can optionally delete destination files that don't exist in the source when you provide the --delete-destination=prompt|true flag. By default, the delete behavior is disabled.

Note

Use the --delete-destination flag with caution. Enable the soft delete feature before you enable delete behavior in sync to prevent accidental deletions in your account.

When --delete-destination is set to true, AzCopy deletes files that don't exist in the source from the destination without prompting the user. If you want to be prompted for confirmation, use --delete-destination=prompt.
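
For example, to be prompted before each deletion during a sync, you could add the flag like this. The container and SAS token are placeholders, as in the examples below:

.\azcopy sync "C:\local\path" "https://account.blob.core.windows.net/mycontainer1<sastoken>" --recursive=true --delete-destination=prompt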

To sync your local file system to a storage account, use the following command:

.\azcopy sync "C:\local\path" "https://account.blob.core.windows.net/mycontainer1<sastoken>" --recursive=true

You can also sync a blob container down to a local file system:

# The SAS token isn't required for Azure Active Directory authentication.
.\azcopy sync "https://account.blob.core.windows.net/mycontainer1" "C:\local\path" --recursive=true

This command incrementally syncs the source to the destination based on last modified timestamps. If you add or delete a file in the source, AzCopy v10 does the same in the destination. When the --delete-destination=prompt flag is set, AzCopy prompts you to confirm before each deletion.

Advanced configuration

Configure proxy settings

To configure the proxy settings for AzCopy v10, set the environment variable https_proxy by using the following command:

# For Windows:
set https_proxy=<proxy IP>:<proxy port>
# For Linux:
export https_proxy=<proxy IP>:<proxy port>
# For macOS:
export https_proxy=<proxy IP>:<proxy port>

Optimize throughput

Set the environment variable AZCOPY_CONCURRENCY_VALUE to configure the number of concurrent requests, and to control the throughput performance and resource consumption. The value is set to 300 by default. Reducing the value will limit the bandwidth and CPU used by AzCopy v10.

# For Windows:
set AZCOPY_CONCURRENCY_VALUE=<value>
# For Linux:
export AZCOPY_CONCURRENCY_VALUE=<value>
# For macOS:
export AZCOPY_CONCURRENCY_VALUE=<value>
# To check the current value of the variable on any platform
.\azcopy env
# If the value is blank, then the default value is currently in use

Change the location of the log files

You can change the location of the log files if needed, for example to avoid filling up the OS disk.

# For Windows:
set AZCOPY_LOG_LOCATION=<value>
# For Linux:
export AZCOPY_LOG_LOCATION=<value>
# For macOS:
export AZCOPY_LOG_LOCATION=<value>
# To check the current value of the variable on any platform
.\azcopy env
# If the value is blank, then the default value is currently in use

Change the default log level

By default, the AzCopy log level is set to INFO. If you'd like to reduce the log verbosity to save disk space, override the setting by using the --log-level option. Available log levels are: DEBUG, INFO, WARNING, ERROR, PANIC, and FATAL.
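
For example, to log only errors during a copy, you could add the option like this. The local path, container, and SAS token are placeholders:

.\azcopy cp "C:\local\path" "https://account.blob.core.windows.net/container?sastoken" --recursive=true --log-level=ERROR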

Review the logs for errors

The following command will get all errors with UPLOADFAILED status from the 04dc9ca9-158f-7945-5933-564021086c79 log:

cat 04dc9ca9-158f-7945-5933-564021086c79.log | grep -i UPLOADFAILED
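
On Windows, you can search the log with findstr instead. The log file name here is the example job ID shown above:

findstr /i UPLOADFAILED 04dc9ca9-158f-7945-5933-564021086c79.log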

Troubleshooting

AzCopy v10 creates log files and plan files for every job. You can use the logs to investigate and troubleshoot any potential problems. The logs contain the status of failure (UPLOADFAILED, COPYFAILED, and DOWNLOADFAILED), the full path, and the reason for the failure. The job logs and plan files are located in the %USERPROFILE%\.azcopy folder on Windows, or in the $HOME/.azcopy folder on Mac and Linux.

Important

When you submit a request to Microsoft Support (or troubleshoot an issue involving a third party), share the redacted version of the command you want to execute. This ensures the SAS token isn't accidentally shared with anybody. You can find the redacted version at the start of the log file.

View and resume jobs

Each transfer operation will create an AzCopy job. Use the following command to view the history of jobs:

.\azcopy jobs list

To view the job statistics, use the following command:

.\azcopy jobs show <job-id>

To filter the transfers by status, use the following command:

.\azcopy jobs show <job-id> --with-status=Failed

Use the following command to resume a failed or canceled job. This command uses the job identifier along with the SAS token, because the SAS token isn't persisted for security reasons:

.\azcopy jobs resume <jobid> --source-sas="sastokenhere" --destination-sas="sastokenhere"

Next steps

If you have questions, issues, or general feedback, submit them on GitHub.