KrishnaNageshKukkadapu-3854 asked dgreatjuan-3160 answered

Issue with Logging with Azure Data Factory

Hello,
I have a copy activity in Azure Data Factory that copies flat files from a file share to Azure Blob Storage. I would like to record a log of all the files being copied, along with the result and the data consistency check.

I tried enabling the data consistency check box and logging in the copy activity, but I was only able to visualize them, not record them for future use.

Can you please help with how we can fix this?

Thanks
Krishna.

azure-data-factory

MartinJaffer-MSFT answered KrishnaNageshKukkadapu-3854 commented

I did find a way to write the output to a file. I made use of the "Additional columns" feature when copying a DelimitedText dataset. See the picture below.

10555-patternwriteoutputdetail.jpg

After the main copy activity completes, I grab the details I want from the output you shared and put them in a String-type variable.
Then I run another copy activity. This one takes an almost-blank CSV file, uses the "Additional columns" feature to add the variable as a new column, and writes the combination to a new CSV file.

It is possible to skip the Set variable step and reference the details directly, but I find the Set variable makes for easier debugging.
Which expression to use for getting the details depends on what you want to capture.
If you wanted to capture only the data verification, it could look like @{activity('Copy data').output.dataConsistencyVerification}
You probably also want to capture the activityRunId or pipelineRunId: @{activity('Copy data').activityRunId}
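To illustrate what those two expressions pull out, here is a small Python sketch that does the equivalent extraction on a sample Copy activity output. The field names mirror the ADF Copy activity output, but the sample values and the helper function are made up for illustration only:

```python
import json

# Hypothetical sample of a Copy activity's output JSON. The field names
# (dataConsistencyVerification, filesRead, filesWritten) follow the ADF
# Copy activity output shape, but the values here are invented.
sample_output = json.dumps({
    "dataConsistencyVerification": {"VerificationResult": "Verified"},
    "filesRead": 3,
    "filesWritten": 3,
})

def extract_log_fields(activity_output: str, run_id: str) -> dict:
    """Pick out the fields worth persisting, mirroring the expressions
    @{activity('Copy data').output.dataConsistencyVerification} and
    @{activity('Copy data').activityRunId}."""
    out = json.loads(activity_output)
    return {
        "activityRunId": run_id,
        "dataConsistencyVerification": out.get("dataConsistencyVerification"),
    }

row = extract_log_fields(sample_output, "run-0001")
print(json.dumps(row))
```

The resulting one-line JSON string is what you would hold in the String variable and append as an "Additional column" in the second copy.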





Thanks a lot, it works.

Is there a way we can also find the list of files it copied?
VaibhavChaudhari answered KrishnaNageshKukkadapu-3854 commented

In my opinion, the logging functionality only logs the incompatible rows or the files that were not accessible at the source. In short, it logs details of the files that were skipped during the copy.

By default, the Copy activity stops copying data and returns a failure when source data rows are incompatible with sink data rows. To make the copy succeed, you can configure the Copy activity to skip and log the incompatible rows and copy only the compatible data:

https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-overview#fault-tolerance
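As a rough sketch of what that configuration looks like, the fault-tolerance settings sit in the Copy activity's typeProperties, per the doc linked above. The linked service name and path below are placeholders, and the source/sink types depend on your actual datasets:

```json
"typeProperties": {
    "source": { "type": "FileSystemSource" },
    "sink": { "type": "BlobSink" },
    "enableSkipIncompatibleRow": true,
    "redirectIncompatibleRowSettings": {
        "linkedServiceName": {
            "referenceName": "MyBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "path": "errorlogs/incompatiblerows"
    }
}
```

With this in place, skipped rows are redirected to the given blob path instead of failing the activity.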




If the response helped, do "Accept Answer" and upvote it - Vaibhav


Hello Vaibhav,

Thanks for your reply. I am actually looking to store the output of the pipeline/tasks in Azure Data Factory to a file, so that we can refer to it in the future in case of any issues.
For example, I see the output below in JSON format. Is there a way I can store this output to a file?


10584-untitled.png



The attachment is the output.

dgreatjuan-3160 answered

You can store this JSON info in an array-type variable, then pass it as the body of a Web activity. The Web activity's URL should be the Logic App's URL. Then, in the Logic App, you can write it to a CSV file and output it to a blob.
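The CSV-writing step inside the Logic App can be sketched in Python. This is only an illustration of the transformation, not the Logic App itself; the record fields are made up, and the real payload would be whatever your pipeline posts in the Web activity body:

```python
import csv
import io
import json

def json_records_to_csv(payload: str) -> str:
    """Flatten a JSON array of log records (as posted in the Web
    activity body) into CSV text, ready to be written to a blob."""
    records = json.loads(payload)
    if not records:
        return ""
    buf = io.StringIO()
    # Use the first record's keys as the CSV header row.
    writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

# Made-up pipeline output records for illustration.
body = json.dumps([
    {"activityRunId": "run-0001", "filesRead": 3, "filesWritten": 3},
    {"activityRunId": "run-0002", "filesRead": 5, "filesWritten": 5},
])
print(json_records_to_csv(body))
```

In the actual Logic App you would do the same flattening with built-in actions (e.g. Create CSV table) and then a blob-write step.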
