question

panditarindam asked panditarindam commented

ADF Dataflow out of memory error

Getting the following error while loading 215 M rows in CSV format from Data Lake Gen 2 to Data Lake Gen 2.

Using a memory-optimized cluster with 8 cores and 10 round-robin partitions. The source is two CSV files, 16 GB and 11 GB.

Error message:

Error: Status Code DF execute user error: Job Failed due to reason, unable to acquire 732 bytes of memory got 0.

azure-data-factory

Are you getting the error during Data Preview, pipeline debug, or pipeline trigger?


We are getting it during a debug run. We increased the cluster to 16 cores, but it still fails in the debug run.


Hi Mark Kromer, we ran it in Debug and it fails, but it runs successfully in a trigger run. We are trying to understand why it fails in Debug. Any help will be highly appreciated.




When you execute a data flow activity in a pipeline in debug mode, you are not using the activity's compute settings. The "Run on" Azure IR setting on the activity is only honored during pipeline-triggered runs. Debug sessions use the Azure IR associated with your debug settings, not the activity's IR.

You can test your activity in pipeline debug mode with a larger compute size either by (a) using a larger Azure IR for your debug session, or (b) clicking Debug > Use activity runtime. That will execute a true test of your data flow activity.
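For context, the compute override being discussed lives on the Execute Data Flow activity itself. A minimal sketch of the relevant pipeline JSON (activity and data flow names are illustrative) looks roughly like this; in a triggered run the compute block below is honored, while a plain debug run uses the debug session's IR instead:

```json
{
  "name": "LoadCsvDataFlow",
  "type": "ExecuteDataFlow",
  "typeProperties": {
    "dataFlow": {
      "referenceName": "MyDataFlow",
      "type": "DataFlowReference"
    },
    "compute": {
      "computeType": "MemoryOptimized",
      "coreCount": 16
    }
  }
}
```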


1 Answer

Kiran-MSFT answered panditarindam commented

It is bad practice to set a fixed partition count for file sources. You are trying to fit 215M * rowsize / 10 into a single node of the cluster. Remove the partitioning if you are using a tiny cluster.
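To illustrate, a fixed round-robin partitioning set on a transformation's Optimize tab shows up in the data flow script as a partitionBy clause, roughly like this (a sketch; the sink name is illustrative). Deleting the clause reverts the transformation to "Use current partitioning", which preserves the source's natural file splits:

```
csvSource sink(allowSchemaDrift: true,
    validateSchema: false,
    partitionBy('roundRobin', 10)) ~> csvSink
```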


We have tried it with no partitioning, that is, Current Partitioning. It still failed. Interestingly, when we run it on a trigger it runs successfully.

We are trying to understand why it fails in a debug run.
