Tutorial: Import data to Blob Storage with Azure Import/Export service
This article provides step-by-step instructions on how to use the Azure Import/Export service to securely import large amounts of data to Azure Blob storage. To import data into Azure Blobs, the service requires you to ship encrypted disk drives containing your data to an Azure datacenter.
In this tutorial, you learn how to:
- Prerequisites to import data to Azure Blob storage
- Step 1: Prepare the drives
- Step 2: Create an import job
- Step 3: Configure customer managed key (Optional)
- Step 4: Ship the drives
- Step 5: Update job with tracking information
- Step 6: Verify data upload to Azure
Before you create an import job to transfer data into Azure Blob Storage, carefully review and complete the following list of prerequisites for this service. You must:
- Have an active Azure subscription that can be used for the Import/Export service.
- Have at least one Azure Storage account with a storage container. See the list of Supported storage accounts and storage types for Import/Export service.
- Have an adequate number of disks of supported types.
- Have a Windows system running a supported OS version.
- Enable BitLocker on the Windows system. See How to enable BitLocker.
- Download the current release of the Azure Import/Export version 1 tool, for blobs, on the Windows system:
- Download WAImportExport version 1.
- Unzip to the default folder WaImportExportV1.
- Have a FedEx/DHL account. If you want to use a carrier other than FedEx/DHL, contact the Azure Data Box Operations team.
- The account must be valid, must have a balance, and must have return shipping capabilities.
- Generate a tracking number for the job.
- Every job should have a separate tracking number. Multiple jobs with the same tracking number are not supported.
- If you do not have a carrier account, create a FedEx or DHL carrier account.
Step 1: Prepare the drives
This step generates a journal file. The journal file stores basic information such as drive serial number, encryption key, and storage account details.
Perform the following steps to prepare the drives.
Connect your disk drives to the Windows system via SATA connectors.
Create a single NTFS volume on each drive. Assign a drive letter to the volume. Do not use mount points.
Enable BitLocker encryption on the NTFS volume. If using a Windows Server system, use the instructions in How to enable BitLocker on Windows Server 2012 R2.
Copy data to the encrypted volume. Use drag and drop, Robocopy, or any similar copy tool.
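The copy step can be done with Robocopy; a minimal sketch, assuming the source data lives in C:\SourceData and the encrypted volume is mounted as E: (both paths are hypothetical placeholders):

```cmd
REM Copy all files and subfolders (including empty ones) to the
REM BitLocker-encrypted volume. C:\SourceData and E: are hypothetical
REM placeholders - substitute your own source folder and drive letter.
robocopy C:\SourceData E:\ /E /R:3 /W:5 /LOG:copylog.txt
```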
If the drive is locked and you need to unlock the drive, the steps to unlock may be different depending on your use case.
If you have added data to a pre-encrypted drive (WAImportExport tool was not used for encryption), use the BitLocker key (a numerical password that you specify) in the popup to unlock the drive.
If you have added data to a drive that was encrypted by WAImportExport tool, use the following command to unlock the drive:
WAImportExport Unlock /bk:<BitLocker key (base64 string) copied from journal (.jrn) file>
Open a PowerShell or command-line window with administrative privileges. To change directory to the unzipped folder, run the following command:
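For example, the directory change might look like this, assuming the tool was unzipped to the default folder on the C: drive (a hypothetical location; use your own path):

```cmd
REM C:\WaImportExportV1 is a hypothetical path - change it to wherever you unzipped the tool.
cd /d C:\WaImportExportV1
```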
To get the BitLocker key of the drive, run the following command:
manage-bde -protectors -get <DriveLetter>:
To prepare the disk, run the following command. Depending on the data size, disk preparation may take several hours to days.
./WAImportExport.exe PrepImport /j:<journal file name> /id:session<session number> /t:<Drive letter> /bk:<BitLocker key> /srcdir:<Drive letter>:\ /dstdir:<Container name>/ /blobtype:<BlockBlob or PageBlob> /skipwrite
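As a concrete illustration, a full PrepImport invocation might look like the following; the journal name, session ID, drive letter, BitLocker key, and container name are all hypothetical placeholders:

```cmd
REM All values below are hypothetical placeholders - substitute your own journal
REM name, drive letter, BitLocker key (from manage-bde or the journal file),
REM and destination container name.
.\WAImportExport.exe PrepImport /j:9WM35C2V.jrn /id:session1 /t:D /bk:123456-654321-112233-445566-778899-001122-334455-667788 /srcdir:D:\ /dstdir:importcontainer/ /blobtype:BlockBlob /skipwrite
```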
A journal file is created in the same folder where you ran the tool. Two other files are also created - an .xml file (folder where you run the tool) and a drive-manifest.xml file (folder where data resides).
The parameters used are described in the following table:
| Option | Description |
|--------|-------------|
| /j: | The name of the journal file, with the .jrn extension. A journal file is generated per drive. We recommend that you use the disk serial number as the journal file name. |
| /id: | The session ID. Use a unique session number for each instance of the command. |
| /t: | The drive letter of the disk to be shipped. For example, drive D. |
| /bk: | The BitLocker key for the drive. It's the numerical password from the output of manage-bde -protectors -get D:. |
| /srcdir: | The drive letter of the disk to be shipped followed by :\. For example, D:\. |
| /dstdir: | The name of the destination container in Azure Storage. |
| /blobtype: | The type of blobs you want to import the data to. For block blobs, the blob type is BlockBlob; for page blobs, it is PageBlob. |
| /skipwrite: | Specifies that there is no new data required to be copied and that existing data on the disk is to be prepared. |
| /enablecontentmd5: | When enabled, ensures that MD5 is computed and set as the Content-md5 property on each blob. Use this option only if you want to use the Content-md5 field after the data is uploaded to Azure. This option does not affect the data integrity check (which occurs by default). The setting does increase the time taken to upload data to the cloud. |
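The Content-md5 property is the base64-encoded MD5 digest of the blob's bytes. If you want to compute that value for a local file yourself, a PowerShell sketch (the file path is a hypothetical placeholder):

```powershell
# C:\Data\file.bin is a hypothetical path - substitute the file you want to hash.
$bytes = [System.IO.File]::ReadAllBytes("C:\Data\file.bin")
$md5   = [System.Security.Cryptography.MD5]::Create()
# Base64-encode the 16-byte MD5 digest, as stored in the blob's Content-md5 property.
[Convert]::ToBase64String($md5.ComputeHash($bytes))
```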
If you import a blob with the same name as an existing blob in the destination container, the imported blob overwrites the existing blob. In earlier tool versions, the imported blob was renamed by default, and a /Disposition parameter let you specify whether to rename, overwrite, or disregard the blob in the import.
Repeat the previous step for each disk that needs to be shipped.
A journal file with the provided name is created for every run of the command line.
Together with the journal file, a <Journal file name>_DriveInfo_<Drive serial ID>.xml file is also created in the same folder where the tool resides. The .xml file is used in place of the journal file when creating a job if the journal file is too big.
- Do not modify the journal files or the data on the disk drives, and don't reformat any disks, after completing disk preparation.
- The maximum size of the journal file that the portal allows is 2 MB. If the journal file exceeds that limit, an error is returned.
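The 2-MB limit can be checked before you go to the portal; a PowerShell sketch (the journal path and name are hypothetical placeholders):

```powershell
# 9WM35C2V.jrn and its folder are hypothetical placeholders - substitute your own.
$journal = Get-Item "C:\WaImportExportV1\9WM35C2V.jrn"
if ($journal.Length -gt 2MB) {
    "Journal exceeds 2 MB - upload the <Journal file name>_DriveInfo_<Drive serial ID>.xml file instead."
}
```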
Step 2: Create an import job
Perform the following steps to create an import job in the Azure portal.
Log on to https://portal.azure.com/.
Search for import/export jobs.
Select + New.
Select a subscription.
Select a resource group, or select Create new and create a new one.
Enter a descriptive name for the import job. Use the name to track the progress of your jobs.
- The name may contain only lowercase letters, numbers, and hyphens.
- The name must start with a letter, and may not contain spaces.
Select Import into Azure.
Select Next: Job details > to proceed.
In Job details:
Upload the journal files that you created during the preceding Step 1: Prepare the drives. If waimportexport.exe version 1 was used, upload one file for each drive that you prepared. If the journal file size exceeds 2 MB, you can instead use the <Journal file name>_DriveInfo_<Drive serial ID>.xml file that is created along with the journal file.
Select the destination Azure region for the order.
Select the storage account for the import.
The dropoff location is automatically populated based on the region of the storage account selected.
If you don't want to save a verbose log, clear the Save verbose log in the 'waimportexport' blob container option.
Select Next: Shipping > to proceed.
Select the carrier from the dropdown list. If you want to use a carrier other than FedEx/DHL, choose an existing option from the dropdown, and contact the Azure Data Box Operations team with the information regarding the carrier you plan to use.
Enter a valid carrier account number that you have created with that carrier. Microsoft uses this account to ship the drives back to you once your import job is complete. If you do not have an account number, create a FedEx or DHL carrier account.
Provide a complete and valid contact name, phone, email, street address, city, zip, state/province and country/region.
Instead of specifying an email address for a single user, provide a group email. This ensures that you receive notifications even if an admin leaves.
Select Review + create to proceed.
In the order summary:
- Review the Terms, and then select "I acknowledge that all the information provided is correct and agree to the terms and conditions." Validation is then performed.
- Review the job information provided in the summary. Make a note of the job name and the Azure datacenter shipping address to ship disks back to Azure. This information is used later on the shipping label.
- Select Create.
Step 3 (Optional): Configure customer managed key
Skip this step and go to the next step if you want to use the Microsoft managed key to protect your BitLocker keys for the drives. To configure your own key to protect the BitLocker key, follow the instructions in Configure customer-managed keys with Azure Key Vault for Azure Import/Export in the Azure portal.
Step 4: Ship the drives
FedEx, UPS, or DHL can be used to ship the package to the Azure datacenter. If you want to use a carrier other than FedEx/DHL, contact the Azure Data Box Operations team.
- Provide a valid FedEx, UPS, or DHL carrier account number that Microsoft will use to ship the drives back.
- When shipping your packages, you must follow the Microsoft Azure Service Terms.
- Properly package your disks to avoid potential damage and delays in processing.
Step 5: Update the job with tracking information
After shipping the disks, return to the Import/Export page on the Azure portal.
If the tracking number is not updated within 2 weeks of creating the job, the job expires.
To update the tracking number, perform the following steps.
- Select the job.
- Click Update job status and tracking info once drives are shipped.
- Select the Mark as shipped checkbox.
- Provide the Carrier and Tracking number.
- Track the job progress on the portal dashboard. For a description of each job state, go to View your job status.
You can only cancel a job while it's in Creating state. After you provide tracking details, the job status changes to Shipping, and the job can't be canceled.
Step 6: Verify data upload to Azure
Track the job to completion. Once the job is complete, verify that your data has uploaded to Azure. Delete the on-premises data only after you have verified that the upload was successful. For more information, see Review Import/Export copy logs.