NathanCarns-0092 asked:

Data Factory Pipeline Update

Hi.

I have a Data Factory pipeline with a MariaDB source database and an Azure SQL Database destination. I want to be able to add new tables from the source without recreating the entire pipeline. How would I do this? Tables and columns are being added to the MariaDB, and I'm having to delete and recreate the pipeline every time. I'm also deleting the tables inside the SQL DB by hand. Is there an easy way to drop all tables and views from the database?
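On the last point, rather than deleting objects one by one, a T-SQL script run against the Azure SQL database can drop every user view and table. This is a sketch only: it permanently deletes objects, assumes you have permission to drop everything, so test it on a copy first. Foreign-key constraints are dropped first so the tables can then be dropped in any order.

```sql
-- CAUTION: permanently drops all user views and tables in the current database.
DECLARE @sql NVARCHAR(MAX) = N'';

-- 1. Drop all foreign-key constraints so table drop order doesn't matter
SELECT @sql += N'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id))
             + N'.' + QUOTENAME(OBJECT_NAME(parent_object_id))
             + N' DROP CONSTRAINT ' + QUOTENAME(name) + N';'
FROM sys.foreign_keys;
EXEC sp_executesql @sql;

-- 2. Drop all user views
SET @sql = N'';
SELECT @sql += N'DROP VIEW ' + QUOTENAME(s.name) + N'.' + QUOTENAME(v.name) + N';'
FROM sys.views v
JOIN sys.schemas s ON v.schema_id = s.schema_id;
EXEC sp_executesql @sql;

-- 3. Drop all user tables
SET @sql = N'';
SELECT @sql += N'DROP TABLE ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N';'
FROM sys.tables t
JOIN sys.schemas s ON t.schema_id = s.schema_id;
EXEC sp_executesql @sql;
```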

Please help with this.

Thanks.

Tags: azure-data-factory · azure-sql-database

1 Answer

RyanAbbey-0701 answered:

You'll need a pre-step within the pipeline that identifies the tables to pull; you then use a ForEach activity to loop through the list of tables, passing one table at a time down to your transfer process.
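For that pre-step, a Lookup activity can run a query like the following against the MariaDB source to enumerate its tables. This is a sketch; `mydb` is a placeholder for your actual schema name:

```sql
-- Enumerate the base tables in the source schema (placeholder name 'mydb')
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'mydb'
  AND table_type = 'BASE TABLE';
```

The Lookup's output array then drives the ForEach, e.g. via the expression `@activity('ListTables').output.value` (assuming the Lookup activity is named `ListTables`).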


Thank you for the reply. What type of pre-step (which activities) would work best in your opinion?


It depends on what you're trying to do: is it every table within your database, or just a predefined subset? If you do a search for something like "data factory dynamic pipeline", a number of people have blogged about the approaches they've taken.
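As a rough illustration of the dynamic-pipeline pattern those posts describe: a ForEach activity iterates over the Lookup's output, and inside it a Copy activity uses parameterized datasets so the same activity handles every table. The activity and dataset names below (`ListTables`, `MariaDbTable`, `AzureSqlTable`) and the `tableName` parameter are hypothetical placeholders, not a complete pipeline:

```json
{
  "name": "ForEachTable",
  "type": "ForEach",
  "typeProperties": {
    "isSequential": false,
    "items": {
      "value": "@activity('ListTables').output.value",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyOneTable",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "MariaDbTable",
            "type": "DatasetReference",
            "parameters": { "tableName": "@item().table_name" }
          }
        ],
        "outputs": [
          {
            "referenceName": "AzureSqlTable",
            "type": "DatasetReference",
            "parameters": { "tableName": "@item().table_name" }
          }
        ],
        "typeProperties": {
          "source": { "type": "MariaDBSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

Each dataset would declare a `tableName` parameter and use it in its table property, so new source tables are picked up on the next run without editing the pipeline.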


I want to extract and update all the tables from a MariaDB database. Every time a new table is created, a new column is added, or rows are appended, I want the source dataset to update automatically (or, worst case, I update the activity manually somehow).

I'll look into dynamic pipelines. Thanks.
