Hi team,
I am getting the error below while copying data from Amazon S3 to Azure Blob. The source is a zipped XML file, and I want to copy it to the target without unzipping.
The zipped file is 77 MB. I am trying to write it to a JSON file in the target.
The integration runtime is self-hosted, and Max concurrent connections and DIU are set to Default. I was able to copy smaller files with the same settings.
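For context, here is a minimal sketch of the copy activity JSON I am describing. The pipeline and dataset names below are placeholders (my real names differ); the source dataset is an XML dataset on S3 with ZipDeflate compression, and the sink dataset is JSON on Blob:

```json
{
  "name": "CopyXmlZipToJson",
  "type": "Copy",
  "inputs": [
    { "referenceName": "S3ZippedXmlDataset", "type": "DatasetReference" }
  ],
  "outputs": [
    { "referenceName": "BlobJsonDataset", "type": "DatasetReference" }
  ],
  "typeProperties": {
    "source": {
      "type": "XmlSource",
      "storeSettings": { "type": "AmazonS3ReadSettings" }
    },
    "sink": {
      "type": "JsonSink",
      "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
    }
  }
}
```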
Failure happened on the 'Sink' side. The full error is:

ErrorCode=SystemErrorOutOfMemory,
'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException, Message=The available memory of the Integrated Runtime (self-hosted) is too small, please increase your machine memory., Source=Microsoft.DataTransfer.TransferTask,'
'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException, Message=A task failed with out of memory., Source=,'
'Type=System.OutOfMemoryException, Message=Exception of type 'System.OutOfMemoryException' was thrown., Source=mscorlib,'
Could you please let me know a workaround to resolve this issue?
I have one more question. If my source in Amazon S3 is XML, can I save the file as XML in my target Azure Blob using the copy activity?
I tried to use an XML sink dataset, but the copy activity only accepts JSON as the target sink when copying an XML source from S3.
Thank you.