I want to send synapse Data Factory pipeline logs to data lake/azure storage. so that I don't have to login to synapse and go to individual pipeline jobs to check the logs.
the specific logs I am talking about are below:
Apologies for the delay in response. Based on my analysis, you can find/monitor information about completed/ended Apache Spark applications (logs emitted by Apache Spark pools) by using the Log Analytics table SynapseBigDataPoolApplicationsEnded.

Ref doc: Azure Synapse Analytics - Apache Spark pool log
To know more about the table, please refer to this doc: Azure Monitor - SynapseBigDataPoolApplicationsEnded.
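Once these logs land in Log Analytics, they can also be pulled programmatically rather than viewed in the portal. Below is a minimal sketch using the azure-monitor-query and azure-identity packages; the workspace ID and pool name are placeholders, and the projected column names are assumptions based on the SynapseBigDataPoolApplicationsEnded table reference, so adjust them to match your workspace.

```python
# Sketch: query the SynapseBigDataPoolApplicationsEnded table from Log
# Analytics. Requires azure-monitor-query and azure-identity; the column
# names in the projection are assumptions -- verify against the table doc.
from datetime import timedelta


def build_ended_apps_query(pool_name: str, hours: int = 24) -> str:
    """Build a KQL query for Spark applications that ended recently."""
    return (
        "SynapseBigDataPoolApplicationsEnded\n"
        f"| where TimeGenerated > ago({hours}h)\n"
        f'| where BigDataPoolName == "{pool_name}"\n'
        "| project TimeGenerated, ApplicationName, ApplicationId"
    )


def fetch_ended_apps(workspace_id: str, pool_name: str):
    """Run the query against a Log Analytics workspace (needs Azure auth)."""
    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())
    response = client.query_workspace(
        workspace_id,
        build_ended_apps_query(pool_name),
        timespan=timedelta(hours=24),
    )
    return response.tables
```

A scheduled job (for example an Azure Function) could call fetch_ended_apps and write the rows to your storage account, which avoids logging in to Synapse to check individual runs.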
In addition to the above, please also note that only Microsoft Support and the Azure Synapse Analytics engineering team can view and download the diagnostic logs associated with your Apache Spark pools. Ref doc: Apache Spark diagnostic logs in Azure Synapse Analytics
Hope this info helps.
As per a recent conversation with the product team, today the Spark logs you see in the Spark application details monitoring view are not configurable within Diagnostic Settings (Azure Monitor, to send logs to blob storage, for example). However, this capability is on our roadmap as a future improvement.
I would recommend keeping an eye on Azure updates for new feature release notifications and other product updates.
If you have any additional feedback, I would recommend submitting it in the Azure Synapse Analytics user voice forum: https://feedback.azure.com/forums/307516-azure-synapse-analytics. Please also share the link here, as it lets other users with similar feedback up-vote and comment on it, which helps increase the priority of feature implementation.
Hope this info helps. Do let us know if you have any further queries.
Thank you
Please don’t forget to Accept Answer and Up-Vote wherever the information provided helps you, this can be beneficial to other community members.
Thanks for using Microsoft Q&A forum and posting your query.
I don't see any option directly in the UI to route the logs to Storage/ADLS Gen2. We are reaching out to the internal team to check if there is a way and will get back to you as soon as we have an update from them.
Also, could you please confirm whether you are asking about pipeline run logs or Apache Spark application (job) logs? If your ask is to collect data about Synapse pipeline runs/activity runs/trigger runs, then you can use the Diagnostic settings option in the Azure portal. You can send diagnostic logs for those categories to a storage account, Event Hubs, or a Log Analytics workspace. Please see the image below for more info:

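If it is the pipeline/activity/trigger run logs you are after, the same diagnostic settings can also be applied programmatically. The sketch below builds the settings payload and applies it with the azure-mgmt-monitor package; the category names and the setting name are assumptions taken from the portal's Diagnostic settings blade, so confirm them against your workspace before relying on this.

```python
# Sketch: route Synapse pipeline/activity/trigger run logs to a storage
# account via diagnostic settings. The category names and the
# azure-mgmt-monitor usage are assumptions -- verify against your workspace.
PIPELINE_RUN_CATEGORIES = [
    "IntegrationPipelineRuns",
    "IntegrationActivityRuns",
    "IntegrationTriggerRuns",
]


def build_diagnostic_settings(storage_account_id: str) -> dict:
    """Build a diagnostic-settings payload that routes run logs to storage."""
    return {
        "storage_account_id": storage_account_id,
        "logs": [
            {"category": c, "enabled": True} for c in PIPELINE_RUN_CATEGORIES
        ],
    }


def apply_settings(workspace_resource_id: str, subscription_id: str,
                   storage_account_id: str):
    """Apply the settings to the workspace (needs Azure auth)."""
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.monitor import MonitorManagementClient

    client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)
    return client.diagnostic_settings.create_or_update(
        workspace_resource_id,
        "route-run-logs-to-storage",  # hypothetical setting name
        build_diagnostic_settings(storage_account_id),
    )
```

Doing this once per workspace keeps all subsequent pipeline/activity/trigger run records flowing to the chosen destination without any per-run steps.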
Just checking to see if you have had a chance to review my previous response. If so, could you please confirm the details requested?
Thank you
Hi there,
We still have not heard back from you. Just wanted to check if you still need assistance with this issue. If you have already found a solution, would you please share it here with the community? Otherwise, please let us know.
Thank you
I need the Apache Spark application (job) logs. In short, what I want is for whatever print/log statements I have written in my PySpark and Scala notebook code to be sent to me via mail/Teams/etc.
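Since this routing isn't built into Synapse today, one workaround for getting your own print/log statements out of a notebook run is a custom logging handler that buffers the lines and posts them to a Teams incoming webhook at the end of the run. The handler, payload shape, and webhook URL below are all illustrative assumptions, not Synapse APIs.

```python
# Sketch: buffer notebook log output and post it to a Teams incoming
# webhook as one message. Hypothetical helper, not a Synapse feature;
# the webhook URL is a placeholder and `requests` must be installed.
import logging


def build_teams_payload(title: str, lines: list) -> dict:
    """Build a minimal payload accepted by Teams incoming webhooks."""
    return {"title": title, "text": "\n\n".join(lines)}


class TeamsBufferHandler(logging.Handler):
    """Collects formatted records; call send() once at the end of the run."""

    def __init__(self, webhook_url: str, title: str = "Notebook run log"):
        super().__init__()
        self.webhook_url = webhook_url
        self.title = title
        self.lines = []

    def emit(self, record: logging.LogRecord) -> None:
        self.lines.append(self.format(record))

    def payload(self) -> dict:
        return build_teams_payload(self.title, self.lines)

    def send(self) -> None:
        import requests  # assumed available on the Spark pool

        requests.post(self.webhook_url, json=self.payload(), timeout=10)


# Usage in a PySpark notebook cell:
log = logging.getLogger("notebook")
log.setLevel(logging.INFO)
handler = TeamsBufferHandler("https://example.webhook.office.com/placeholder")
handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
log.addHandler(handler)
log.info("row count check passed")
# handler.send()  # uncomment with a real webhook URL to post the summary
```

The same buffered lines could instead be written to an ADLS path or handed to an email-sending step; only the send() method would change.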