question

GovarthananVenkatesan-4333 asked MartinJaffer-MSFT commented

How to Load Azure Data Factory Pipeline Details to Azure SQL Database for Tracking Purpose?

Hi Team,

Can anyone suggest how to load pipeline details to Azure SQL?

For example, I want to load pipeline details into a master table (info like Pipeline Name, Pipeline Activities, Data Factory Name, and so on) and a child table (info like Pipeline ID, Activity details with transformation level, Created Date, and so on).

I've tried PowerShell's

Get-AzDataFactoryV2Pipeline

but it only gives master-level data. I need to load this data into an Azure SQL database, insert an entry whenever a new pipeline is created, and, when a pipeline is deleted, delete its entries from both the master and child tables.
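For reference, here is roughly how the master- and child-level details could be pulled (an untested sketch; the Activities property comes from the PSPipeline objects that Get-AzDataFactoryV2Pipeline returns, and the resource names are placeholders):

```powershell
# Master level: one PSPipeline object per pipeline in the factory
$pipelines = Get-AzDataFactoryV2Pipeline -ResourceGroupName "<ResourceGroupName>" -DataFactoryName "<DFName>"

foreach ($p in $pipelines) {
    # Master row: pipeline name, data factory name, activity count
    "{0} ({1}): {2} activities" -f $p.Name, $p.DataFactoryName, $p.Activities.Count

    # Child level: one row per activity inside the pipeline
    foreach ($a in $p.Activities) {
        "  activity: {0} of type {1}" -f $a.Name, $a.GetType().Name
    }
}
```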

Please help me on this request.



Tags: azure-data-factory, azure-data-lake-storage

Hello @GovarthananVenkatesan-4333 and welcome to Microsoft Q&A.

I am a little confused as to the goal of what you are trying to setup.

Are you wanting to record changes in the pipeline definition, like source control (i.e., Git)?
Or are you wanting to monitor and record the pipeline runs so you can take action on failure?

Thank you for your patience.


Hi @MartinJaffer-MSFT, thanks for your response.

I just want to log pipeline definition changes into Azure SQL.


For now I've tried this method to load the changes to Blob Storage.

 # Export pipeline details to a CSV in the Cloud Shell drive
 Get-AzDataFactoryV2Pipeline -ResourceGroupName "<ResourceGroupName>" -DataFactoryName "<DFName>" | Export-Csv ($home + "/clouddrive/result.csv")
 # Upload the CSV to Blob Storage (New-AzStorageContext/Set-AzStorageBlobContent are
 # the Az-module names; New-AzureStorageContext is from the older AzureRM module)
 $Context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
 Set-AzStorageBlobContent -Context $Context -Container "111" -File ($home + "/clouddrive/result.csv") -Blob "SavedFile.csv"

But this method is not straightforward, because we need one more load to move the changes from Blob Storage into Azure SQL.






1 Answer

MartinJaffer-MSFT answered MartinJaffer-MSFT commented

Thank you for sharing the details @GovarthananVenkatesan-4333 .

My colleague sent me a link to a blog where they connected to SQL using PowerShell and loaded data. This would let you skip the blob step and send the pipeline definition directly to the database.
However, since you are using Azure SQL, there may be extra steps, such as adjusting the firewall.
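A minimal sketch of that direct-to-SQL idea (my assumptions: the SqlServer PowerShell module is installed, a PipelineMaster table already exists, and the server, database, and credential values below are placeholders; the blog's exact method may differ):

```powershell
# Pull pipeline definitions and insert them straight into Azure SQL,
# skipping the intermediate blob step
$pipelines = Get-AzDataFactoryV2Pipeline -ResourceGroupName "<ResourceGroupName>" -DataFactoryName "<DFName>"

foreach ($p in $pipelines) {
    $name    = $p.Name
    $factory = $p.DataFactoryName
    # Invoke-Sqlcmd comes from the SqlServer module (Install-Module SqlServer)
    Invoke-Sqlcmd -ServerInstance "<server>.database.windows.net" `
        -Database "<DatabaseName>" -Username "<User>" -Password "<Password>" `
        -Query "INSERT INTO PipelineMaster (PipelineName, DataFactoryName) VALUES ('$name', '$factory')"
}
```

Remember that the Azure SQL firewall must allow the client's IP (or "Allow Azure services") before Invoke-Sqlcmd can connect.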

I had an idea about how to capture the changes as they happen (using Event Grid), but I do not know if it can work. Azure Event Grid is outside my area of expertise. So far my idea is not working, but I probably just set it up wrong.



@GovarthananVenkatesan-4333 did the link to the blog help you? Do you still need more assistance?


Great! If it solved your issue, please mark as accepted answer and have a great new year!
