
KeaganJordan-4385 asked · HimanshuSinha-MSFT answered

Data Factory Copy Task Running Indefinitely

Good Day

I have an Excel (.xlsx) file in the Data Lake; the file is 48 MB.
I am trying to copy this file into an Azure SQL DB.
This task works for many other, smaller files, but when it gets to this file it just hangs without copying any rows across.

I double-checked: all of the column names are correct, and I am inserting into a staging table with only varchar columns, so it can't fail on a data type issue.

I let the task run for over 16 hours without any progress; the throughput kept dropping to the point where it was being measured in bytes.

Are there any known issues with Excel files in the platform, or has anyone else experienced this problem?

Thanks in advance.

azure-data-factory · azure-sql-database

If you opened the file yourself, is there any problem? (E.g. does it prompt for a password or open read-only, that kind of thing?)

Can you try writing it to another file, to eliminate which side is stalling?


Hi, no, there is no issue with the file.
I have successfully imported many files with the same structure.

The strange thing is that this file did eventually import after about 10 failed runs, during which I did not change anything. After it succeeded, I ran it again to see if it would fail, and it went back to failing.


@KeaganJordan-4385: As suggested, I tested this and was able to process a big Excel file using a data flow. Can you please try that out?
Thanks,
Himanshu


1 Answer

HimanshuSinha-MSFT answered

Hello @KeaganJordan-4385,
Thanks for the ask and for using the Microsoft Q&A platform.
ADF can process big Excel files well. I suggest you try one of the following:

Option 1: Use a Data Flow activity to move the big Excel file into the other data store. In mapping data flows, the Excel source supports streaming reads with low CPU/memory consumption.
Option 2: Manually convert/save the big Excel file as CSV, then use the Copy activity to move it.

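If you go with Option 2, the conversion can be scripted rather than done by hand in Excel. A minimal sketch using pandas (the file names are placeholders, and reading .xlsx assumes the `openpyxl` engine is installed):

```python
import pandas as pd


def xlsx_to_csv(xlsx_path: str, csv_path: str, sheet_name=0) -> int:
    """Convert one sheet of an .xlsx workbook to CSV; return the row count."""
    # read_excel loads the sheet into a DataFrame (uses openpyxl for .xlsx);
    # dtype=str keeps every value as text, matching a varchar staging table
    df = pd.read_excel(xlsx_path, sheet_name=sheet_name, dtype=str)
    # Write without the index so the CSV columns match the sheet's columns
    df.to_csv(csv_path, index=False)
    return len(df)


if __name__ == "__main__":
    # Placeholder paths -- substitute your local copy of the Data Lake file
    rows = xlsx_to_csv("bigfile.xlsx", "bigfile.csv")
    print(f"Wrote {rows} rows")
```

The resulting CSV can then be uploaded back to the Data Lake and picked up by a Copy activity, which handles delimited text far more predictably than .xlsx.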
Please do let me know how it goes.
Thanks,
Himanshu
Please consider clicking "Accept Answer" and "Up-Vote" on the post that helps you, as it can be beneficial to other community members.
