question

0 Votes
Abhinav-Sharma asked · KranthiPakala-MSFT commented

Azure Data Factory - Azure Databricks Notebook

I need to load the output of a "Lookup" activity into an Azure Databricks notebook. Is it possible to pass the output of an ADF Lookup activity directly into an Azure Databricks notebook?

azure-data-factory


1 Vote
KranthiPakala-MSFT answered · KranthiPakala-MSFT commented

Hi @Abhinav-Sharma,

As per my research and discussion with internal teams, here are a few workarounds you could try to overcome the limitations discussed in our previous comments:

  1. Instead of passing the actual data from the Lookup activity into the notebook, you could pass the query used in the Lookup to the notebook and have the notebook read directly from the SQL DB.

  2. Otherwise, you could stage the data in a Parquet file and pass the file's location string into the notebook.

  3. The third option is to use Mapping Data Flows instead of Databricks, if they fit your transformation logic.
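As a minimal sketch of workarounds 1 and 2 on the notebook side (assuming hypothetical base-parameter names `query` and `staging_path`; the `dbutils` and `spark` calls only exist inside a running Databricks notebook, so they are shown as comments):

```python
# Hypothetical notebook-side logic for workarounds 1 and 2.
# Inside Databricks, the ADF base parameters arrive via widgets, e.g.:
#   query = dbutils.widgets.get("query")
#   staging_path = dbutils.widgets.get("staging_path")

def build_jdbc_options(server: str, database: str, query: str) -> dict:
    """Build spark.read JDBC options so Azure SQL DB executes the Lookup's query."""
    return {
        "url": f"jdbc:sqlserver://{server}:1433;database={database};encrypt=true",
        # The query string forwarded from ADF instead of the data itself.
        "query": query,
    }

# Workaround 1: push the Lookup's query down to Azure SQL DB.
#   df = spark.read.format("jdbc").options(
#       **build_jdbc_options("myserver.database.windows.net", "mydb", query)
#   ).load()

# Workaround 2: read the data that was staged as Parquet.
#   df = spark.read.parquet(staging_path)
```

The server, database, and parameter names above are placeholders; the point is that only small strings cross the ADF-to-notebook boundary, while the bulk data is read inside the notebook.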

Hope this info helps.



  • Please accept an answer if correct. Original posters help the community find answers faster by identifying the correct answer. Here is how.

  • Want a reminder to come back and check responses? Here is how to subscribe to a notification.



Hi @Abhinav-Sharma,

We still have not heard back from you. I just wanted to check whether the above suggestion was helpful. If it answers your query, please click "Accept Answer" and/or up-vote, as it may benefit other community members reading this thread. And if you have any further queries, do let us know.


Hi @KranthiPakala-MSFT,

Thank you for the response. I will go with the workarounds.

Thanks for clearing up the doubt and confirming the workaround approach.


You are welcome :)

1 Vote
VaibhavChaudhari answered · KranthiPakala-MSFT commented

It is possible to pass the value as a parameter, which can then be read inside the Databricks notebook.

Reference - https://docs.microsoft.com/en-us/azure/data-factory/transform-data-using-databricks-notebook#create-a-pipeline
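For illustration, the Notebook activity's `baseParameters` in the ADF pipeline definition can reference the Lookup output via an expression (a sketch; the activity, notebook path, and parameter names here are hypothetical):

```json
{
  "name": "RunNotebook",
  "type": "DatabricksNotebook",
  "dependsOn": [
    { "activity": "MyLookup", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "notebookPath": "/Shared/transform",
    "baseParameters": {
      "input": "@string(activity('MyLookup').output.firstRow)"
    }
  }
}
```

Inside the notebook, the value would then be read with `dbutils.widgets.get("input")`.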


Please don't forget to Accept Answer and Up-vote if the response helped -- Vaibhav


Thank you. This is very helpful.

As per my understanding, we can pass only 2 MB of data using parameters.

The Lookup activity being used fetches 2k to 3k records. Is there a way to pass more than 2 MB of data as input to the Notebook activity?


Hi @Abhinav-Sharma,

Thanks for your response. Are you referring to the Lookup activity limitation of 5,000 records or 4 MB of data, or a different limitation? Could you please share the public document where the 2 MB limitation is called out?



Hi Kranthi,

I am referring to the input and output parameter limitation of the Databricks Notebook activity. Please refer to the link below.

https://docs.microsoft.com/en-us/azure/data-factory/transform-data-databricks-notebook

My question is: is there any way to pass more than 2 MB of data as input to the Databricks Notebook activity in Azure Data Factory?

The scenario is: I am fetching data from SQL DB using a Lookup activity and then want to pass that data into a Notebook activity for further transformation.


Hi @Abhinav-Sharma,

Apologies for the delayed response, and thanks for clarifying the requirement. At this time it is a hard limit, and I have reached out to the product team to check if there is a way to overcome it. I will get back to you as soon as we have an update from the team.

Thank you for your patience.
