How to Load Azure Data Factory Pipeline Details to Azure SQL Database for Tracking Purposes?

Govarthanan Venkatesan 66 Reputation points
2020-12-20T16:41:19.617+00:00

Hi Team,

Can anyone suggest, how to load Pipeline details to Azure SQL?

For example, I want to load pipeline details into a master table (info like pipeline name, pipeline activities, data factory name, and so on) and a child table (info like pipeline ID, activity details at the transformation level, created date, and so on).

I've tried with PowerShell

Get-AzDataFactoryV2Pipeline

but it only gives master-level data, and I need to load this data into an Azure SQL database. I also need a new entry whenever a pipeline is created, and when a pipeline is deleted, the corresponding entries should be deleted from both the master and child tables.
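
Roughly, here is what I have so far. This is only a minimal sketch: the resource group and data factory names are placeholders, and I am not certain of the exact property names on the returned pipeline objects (Get-Member would confirm them).

```powershell
# Minimal sketch: pull master- and child-level details from one data factory.
# "my-resource-group" and "my-data-factory" are placeholders.
Connect-AzAccount

$resourceGroup = "my-resource-group"
$factoryName   = "my-data-factory"

$pipelines = Get-AzDataFactoryV2Pipeline -ResourceGroupName $resourceGroup `
                                         -DataFactoryName $factoryName

foreach ($pipeline in $pipelines) {
    # Master-level row: one per pipeline
    # (property names assumed; verify with $pipeline | Get-Member)
    [pscustomobject]@{
        PipelineName    = $pipeline.Name
        DataFactoryName = $factoryName
        ActivityCount   = $pipeline.Activities.Count
    }

    # Child-level rows: one per activity in the pipeline
    foreach ($activity in $pipeline.Activities) {
        [pscustomobject]@{
            PipelineName = $pipeline.Name
            ActivityName = $activity.Name
            ActivityType = $activity.GetType().Name
        }
    }
}
```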

Please help me with this request.

Azure Data Lake Storage
Azure Data Factory

1 answer

  1. MartinJaffer-MSFT 26,031 Reputation points
    2020-12-23T00:53:42.863+00:00

    Thank you for sharing the details @Govarthanan Venkatesan.

    My colleague sent me a link to a blog where they connected to SQL using PowerShell and loaded data. This would let you skip the blob step and send the pipeline definition directly to the database.
    However, since you are using Azure SQL, there may be extra steps, such as adjusting the firewall.
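
    Here is a rough sketch of what that could look like. It assumes the SqlServer module is installed (Install-Module SqlServer) and that the Azure SQL firewall allows the machine running the script; the server, database, and table/column names below are placeholders I made up, so adjust them to your own schema.

    ```powershell
    # Minimal sketch: upsert each pipeline into a tracking table in Azure SQL.
    # Server, database, credential, and table/column names are placeholders.
    Import-Module SqlServer

    $server        = "myserver.database.windows.net"
    $database      = "AdfTracking"
    $sqlCredential = Get-Credential          # SQL login for the Azure SQL database
    $resourceGroup = "my-resource-group"
    $factoryName   = "my-data-factory"

    $pipelines = Get-AzDataFactoryV2Pipeline -ResourceGroupName $resourceGroup `
                                             -DataFactoryName $factoryName

    foreach ($pipeline in $pipelines) {
        # Keep the full definition as JSON so no activity detail is lost.
        # Single quotes are doubled so the value survives inside the T-SQL literal;
        # a parameterized client (e.g. System.Data.SqlClient) would be safer.
        $name       = $pipeline.Name        # verify the property name with Get-Member
        $definition = ($pipeline | ConvertTo-Json -Depth 10).Replace("'", "''")

        $query = "
            MERGE dbo.PipelineMaster AS target
            USING (SELECT '$name' AS PipelineName) AS source
                ON target.PipelineName = source.PipelineName
            WHEN MATCHED THEN
                UPDATE SET Definition = '$definition'
            WHEN NOT MATCHED THEN
                INSERT (PipelineName, DataFactoryName, Definition, CreatedDate)
                VALUES (source.PipelineName, '$factoryName', '$definition', SYSUTCDATETIME());"

        Invoke-Sqlcmd -ServerInstance $server -Database $database `
                      -Credential $sqlCredential -Query $query
    }
    ```

    For the delete side of your requirement, one option is to compare the pipeline names returned by Get-AzDataFactoryV2Pipeline against what is already in the master table and remove the master and child rows that are no longer present.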

    I had an idea about how to get the changes as they happen (using Event Grid), but I do not know if it can work. Azure Event Grid is outside my area of expertise. So far my idea is not working, but I probably just set it up wrong.

    1 person found this answer helpful.