Asked by CharbelDaiaMartins-0878

How to orchestrate pipelines and notebook?

Hi,


What is the best practice for scheduling pipelines created in different Azure Data Factory (ADF) instances together with notebooks created in Azure Databricks?

For example:

Run pipeline1 in ADF_A -> Run pipeline1 in ADF_B -> Run notebook1 in Databricks

What would be the best approach/resource within Azure for this?


Thanks

Tags: azure-data-factory, azure-databricks

1 Answer

Answered by PRADEEPCHEEKATLA-MSFT

Hello @CharbelDaiaMartins-0878,

Thanks for the question and using the MS Q&A platform.

There are multiple ways of doing this:

  • Using the Execute Pipeline activity (a control flow activity) - it allows a Data Factory pipeline to invoke another pipeline in the same data factory.

  • Using activity dependencies - an activity dependency defines how subsequent activities depend on previous activities, determining whether to continue executing the next task. An activity can depend on one or more previous activities with different dependency conditions (Succeeded, Failed, Skipped, Completed).

  • Using tumbling window trigger dependencies - you can create dependent pipelines in your ADF by adding dependencies among the tumbling window triggers in your pipelines. By creating a dependency, you can guarantee that a trigger executes only after the successful execution of a dependent trigger in your data factory.
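To make the first two options concrete, here is a minimal sketch of an Execute Pipeline activity definition combined with an activity dependency, expressed as the JSON you would see in the pipeline's code view (built as a Python dict here for readability). The activity and pipeline names are placeholders, not names from your factories:

```python
import json

# Hypothetical activity: runs "pipeline1" only after the activity
# "RunPreviousStep" has succeeded. Names are placeholders.
execute_pipeline_activity = {
    "name": "RunPipeline1",                # placeholder activity name
    "type": "ExecutePipeline",             # control flow activity type
    "dependsOn": [
        {
            # Continue only if the previous activity succeeded
            "activity": "RunPreviousStep",
            "dependencyConditions": ["Succeeded"]
        }
    ],
    "typeProperties": {
        "pipeline": {
            "referenceName": "pipeline1",  # pipeline in the SAME factory
            "type": "PipelineReference"
        },
        "waitOnCompletion": True           # block until the child pipeline finishes
    }
}

print(json.dumps(execute_pipeline_activity, indent=2))
```

Note that the Execute Pipeline activity can only reference pipelines within the same data factory; invoking a pipeline in a different factory requires the REST API (see the comments below).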

Hope this helps. Do let us know if you have any further queries.


Please "Accept the answer" if the information helped you. This will help us and others in the community as well.



Thanks for the answer.

Assuming I only have access to ADF_B, while ADF_A is managed by another team and I don't have access to it:

Would I orchestrate with another data factory, for example ADF_C? Or with a resource other than Data Factory, such as Logic Apps or Azure Functions?

Thanks


Hello @CharbelDaiaMartins-0878,

If you would like to run a pipeline in one ADF from another ADF, you would need to issue a REST API call for that.

This link has the details: https://docs.microsoft.com/en-us/rest/api/datafactory/pipelines/create-run
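As a rough sketch, the Create Run endpoint from that link could be called like this from Python. The subscription, resource group, factory, and pipeline names below are placeholders, and you would need to obtain a valid Azure AD bearer token (e.g. via `az account get-access-token`) yourself:

```python
import json
import urllib.request

API_VERSION = "2018-06-01"  # Data Factory REST API version

def create_run_url(subscription_id, resource_group, factory_name, pipeline_name):
    """Build the Create Run endpoint URL for a pipeline in the target factory."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.DataFactory/factories/{factory_name}"
        f"/pipelines/{pipeline_name}/createRun"
        f"?api-version={API_VERSION}"
    )

def run_remote_pipeline(url, bearer_token, parameters=None):
    """POST to the Create Run endpoint; the response body contains the run ID."""
    body = json.dumps(parameters or {}).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["runId"]

# Placeholder values for illustration only:
url = create_run_url("my-subscription-id", "my-resource-group", "ADF_A", "pipeline1")
print(url)
```

An ADF Web activity in ADF_B (with a managed identity granted the Data Factory Contributor role on ADF_A) could issue the same POST without any custom code, which avoids managing tokens yourself.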


Hello @CharbelDaiaMartins-0878,

Just checking in to see if the above answer helped. If this answers your query, do click Accept Answer and Up-Vote for the same. And, if you have any further query do let us know.
