Copy activity failed when copying from GCS to Azure blob

_Esteban Bett 30 Reputation points
2024-04-04T13:38:44.1333333+00:00

I am getting this error today when copying from Google Cloud Storage to Azure Blob Storage. The connection to GCS is fine; I can list the remote files using a Get Metadata activity.

The copy activity is failing:

ErrorCode=UserErrorFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file operation is failed, upload file failed at path: 'xxx'.,Source=mscorlib,''Type=System.ArgumentOutOfRangeException,Message=Specified argument was out of the range of valid values.
Parameter name: size,Source=Microsoft.Hadoop.Avro,'


2 answers

  1. KarishmaTiwari-MSFT 18,627 Reputation points Microsoft Employee
    2024-04-05T00:02:22.1933333+00:00

    @_Esteban Bett Thanks for posting your query on Microsoft Q&A.

    Here is the documentation on copying data from Google Cloud Storage to Azure Storage by using AzCopy: https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-google-cloud
    Can you please share the steps you are following and where you are seeing the error?

    The error message points to a failed file upload with an out-of-range "size" parameter, thrown from Microsoft.Hadoop.Avro. Ensure that the file you're trying to upload is within the valid size range, and check whether there are any restrictions on the maximum blob size for the target Azure Blob Storage container.

    Also, check https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-google-cloud#handle-differences-in-object-naming-rules
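    If it helps with that size check, here is a minimal Python sketch (not from the linked documentation) that lists the size of every object under a prefix so you can spot anything unexpectedly large. It assumes the google-cloud-storage package, placeholder bucket/prefix names, and that GOOGLE_APPLICATION_CREDENTIALS points at a service-account key:

    ```python
    # List GCS object sizes under a prefix (bucket/prefix are placeholders).
    from google.cloud import storage

    client = storage.Client()  # reads GOOGLE_APPLICATION_CREDENTIALS
    for blob in client.list_blobs("my-source-bucket", prefix="my/prefix/"):
        print(f"{blob.size:>15,} bytes  {blob.name}")
    ```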


  2. JohnyBenz 306 Reputation points
    2024-05-01T11:03:30.8933333+00:00

    This error suggests that a size limit is being exceeded when copying files from Google Cloud Storage to Azure Blob Storage. A few things to check:

    Check the block size being used for the upload. Each block in an Azure block blob is limited to 100 MiB on older service versions (4,000 MiB on service version 2019-12-12 and later). A single file can be larger than that because uploads are split into blocks, but the upload fails if any individual block exceeds the limit.
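    To make the arithmetic concrete, here is a small Python sketch; the 50,000-block cap is the documented block-blob limit, while the block and file sizes are made-up examples:

    ```python
    # Check whether a file fits in a block blob at a given block size.
    import math

    MAX_BLOCKS = 50_000                 # per-blob block cap
    block_size = 100 * 1024 * 1024      # 100 MiB (older service versions)
    file_size = 6 * 1024 ** 4           # hypothetical 6 TiB file

    blocks_needed = math.ceil(file_size / block_size)
    print(f"blocks needed: {blocks_needed:,} (limit {MAX_BLOCKS:,})")
    print("fits" if blocks_needed <= MAX_BLOCKS else "too large at this block size")
    ```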

    Check the total size and number of files being copied in one operation; splitting the copy into multiple batches may help isolate the problem.

    Ensure the Azure storage account and container being copied to support the blob type and sizes needed for the files. A block blob holds at most 50,000 blocks, which works out to roughly 4.75 TiB with 100 MiB blocks (newer service versions allow larger blocks and therefore much larger blobs).
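    One way to sanity-check the account outside Data Factory is to upload the same file yourself with an explicit block size. A minimal sketch using the azure-storage-blob SDK (v12); the connection string, container, and file names are placeholders:

    ```python
    # Upload a large local file with explicit block sizing (placeholder names).
    from azure.storage.blob import BlobClient

    blob = BlobClient.from_connection_string(
        "<connection-string>",                 # placeholder
        container_name="target-container",
        blob_name="big-file.avro",
        max_block_size=100 * 1024 * 1024,      # 100 MiB blocks
        max_single_put_size=64 * 1024 * 1024,  # chunk uploads above 64 MiB
    )
    with open("big-file.avro", "rb") as data:
        blob.upload_blob(data, overwrite=True)
    ```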

    Try copying files one by one instead of all at once to narrow down whether one specific large file is causing the issue.
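    If doing that in Data Factory is awkward, a rough Python sketch outside ADF can perform the same isolation, copying each object individually and logging any failure. It uses the google-cloud-storage and azure-storage-blob SDKs; bucket and container names are placeholders, and it buffers each object in memory, so it only suits modest file sizes:

    ```python
    # Copy GCS objects to Azure one at a time, reporting which one fails.
    from google.cloud import storage
    from azure.storage.blob import ContainerClient

    gcs = storage.Client()  # reads GOOGLE_APPLICATION_CREDENTIALS
    container = ContainerClient.from_connection_string(
        "<connection-string>", container_name="target-container"
    )

    for blob in gcs.list_blobs("my-source-bucket", prefix="my/prefix/"):
        try:
            container.upload_blob(blob.name, blob.download_as_bytes(), overwrite=True)
            print(f"ok    {blob.name} ({blob.size:,} bytes)")
        except Exception as exc:  # keep going; report the failing object
            print(f"FAIL  {blob.name} ({blob.size:,} bytes): {exc}")
    ```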

    Check network speeds and any throttling - very large file copies over the internet can time out.

    As a workaround, try a different copy path, such as AzCopy (see the documentation linked in the previous answer), which can give clearer errors on large file transfers than the Data Factory Copy activity.
