question

Samy-7940 asked · ShaikMaheer-MSFT answered

Copying a huge amount of data by using ADF

Hi All, I am sorry if this question has come up before. Basically, I want to copy a huge amount of data, billions of rows, into ADLS or Blob Storage. One workaround, from a performance perspective, is to segregate the data into smaller blocks and copy them in stages instead of all at once. I would really appreciate it if you could let me know what other workarounds are possible to achieve both performance and consistency. Thanks in advance.
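One common way to implement the "smaller blocks" idea is to partition the source table on a numeric key and generate per-partition ranges that a driver (for example, an ADF ForEach over a parameterized Copy Activity) can process independently. A minimal sketch, assuming a hypothetical numeric `id` column whose min/max bounds and chunk size are example values:

```python
# Sketch: split a large table into contiguous id ranges so each range
# can be copied by a separate copy job. The id bounds and chunk size
# below are hypothetical examples, not values from the thread.

def partition_ranges(min_id, max_id, chunk_size):
    """Yield (start, end) inclusive ranges covering [min_id, max_id]."""
    start = min_id
    while start <= max_id:
        end = min(start + chunk_size - 1, max_id)
        yield (start, end)
        start = end + 1

# Example: 1 billion ids split into 100M-id chunks -> 10 ranges
ranges = list(partition_ranges(1, 1_000_000_000, 100_000_000))
for lo, hi in ranges:
    # Each tuple could parameterize a source query such as:
    #   SELECT * FROM src WHERE id BETWEEN @lo AND @hi
    pass
```

Each range then maps to one copy run, so a failed chunk can be retried on its own without restarting the whole load.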

azure-data-factory


1 Answer

ShaikMaheer-MSFT answered

Hi @Samy-7940 ,

Thank you for posting your query in Microsoft Q&A Platform.

Below are a few useful recommendations from Microsoft for handling bulk data. Kindly check them. Thank you.

Kindly go with any of the above suggested approaches based on your source and sink types and on your ETL needs.
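Whichever approach you choose, per-partition copies are usually run with bounded parallelism rather than strictly sequentially, which is what an ADF ForEach with a batch count does. A minimal sketch of that pattern, where `copy_range` is a hypothetical stand-in for a real copy job:

```python
# Sketch: copy pre-computed ranges with a bounded worker pool, mirroring
# an ADF ForEach whose batch count caps concurrent Copy Activity runs.
from concurrent.futures import ThreadPoolExecutor

def copy_range(lo, hi):
    # Placeholder for invoking a bulk copy of rows in [lo, hi];
    # a real implementation would call the copy service here.
    return (lo, hi, "ok")

ranges = [(1, 100), (101, 200), (201, 300)]  # example chunks

# max_workers bounds parallelism so the source is not overwhelmed.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(lambda r: copy_range(*r), ranges))
```

Capping the worker count trades some throughput for predictable load on the source system, and `pool.map` keeps results in chunk order for easy auditing.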

Hope this will help. Please let us know if you have any further queries. Thank you.


