Custom libraries (wheels) for the ADF Databricks Python activity running on serverless compute

Krzysztof Przysowa 20 Reputation points
2024-05-01T12:30:52.2366667+00:00

I want to be able to execute Python scripts from Azure Data Factory (via the Databricks Python activity) using serverless compute.

Serverless compute does not support cluster-level (compute-scoped) libraries.
In Databricks workflows, this is handled through "environments and dependencies":
https://learn.microsoft.com/en-us/azure/databricks/workflows/jobs/run-serverless-jobs#configure-environments-and-dependencies-for-non-notebook-tasks
I would like to be able to point to a Databricks workspace file location or a Unity Catalog volume file location for a required library.
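For comparison, this is roughly what the requested capability looks like today when a serverless job is defined directly through the Databricks Jobs API: the task references an environment, and the environment's `spec.dependencies` can list a wheel stored in a volume. The job name, workspace script path, and wheel path below are hypothetical placeholders, and this is a sketch of the documented shape rather than a verified full job definition:

```json
{
  "name": "serverless-python-job",
  "tasks": [
    {
      "task_key": "run_script",
      "spark_python_task": {
        "python_file": "/Workspace/Users/someone@example.com/script.py"
      },
      "environment_key": "default"
    }
  ],
  "environments": [
    {
      "environment_key": "default",
      "spec": {
        "client": "1",
        "dependencies": [
          "/Volumes/main/default/libs/my_lib-0.1.0-py3-none-any.whl"
        ]
      }
    }
  ]
}
```

The ADF Databricks Python activity currently exposes no equivalent of this `environments` block, which is the gap this request is about.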

Please add this functionality.
