Is there a direct connector to connect to a Delta Share (Delta table) and load data into Azure SQL Database?

Ravi Alwal 20 Reputation points
2023-03-20T11:18:44.07+00:00

Description:

Using the Delta Sharing concept, we tried to load the data from a Databricks notebook by creating a Delta Share catalog. The data provider has given the data receiver access to the Delta table. However, this approach requires building a Databricks notebook and adds maintenance overhead, and the data receiver does not have a Databricks workspace.

To avoid this additional work, we are looking for a direct connector that can connect to the Delta Share catalog and load the data into our destination Azure SQL Database table. Also, is there any third-party tool available that can connect directly to the Delta Share (Delta table) and load data directly into Azure SQL?

Tags: Azure Data Share, Azure SQL Database, Azure Databricks, Azure Data Factory

Accepted answer
PRADEEPCHEEKATLA-MSFT 77,751 Reputation points Microsoft Employee
2023-03-21T08:36:06.91+00:00

@Ravi Alwal Thanks for the question and for using the MS Q&A platform.

Yes, there are a few ways to connect Delta tables in Databricks to Azure SQL Database for loading data:

1. Use Azure Data Factory: Azure Data Factory supports copying data from Delta tables in Databricks to Azure SQL Database. You can use the Azure Databricks Delta Lake connector in Data Factory to connect to your Delta table and then configure a Copy activity pipeline to load the data into your Azure SQL Database.
2. Use Azure Synapse Analytics: Azure Synapse Analytics can also load data from Delta tables into Azure SQL Database. You can use PolyBase in Synapse Analytics to create an external table that points to the Delta data in your data lake, and then use a T-SQL query to load data from the external table into your Azure SQL Database.
3. Use the Databricks JDBC/ODBC connector: Databricks provides JDBC/ODBC drivers that let you query your Delta table from outside a notebook. You can use a third-party ETL tool or a custom script to read the Delta table through the JDBC/ODBC connector and then load the data into your Azure SQL Database (see the sketch after this list).
4. Use a third-party ETL tool: several third-party ETL tools support connecting to Delta tables in Databricks and loading data into Azure SQL Database. Some examples include Talend, Informatica, and Matillion.
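
As a rough illustration of option 3, here is a minimal sketch of a custom script that reads the shared table through a Databricks SQL warehouse (using the `databricks-sql-connector` package) and bulk-inserts the rows into Azure SQL Database with `pyodbc`. The hostname, HTTP path, token, table names, and credentials below are placeholders, and the sketch assumes the provider (or receiver) exposes a SQL warehouse you are allowed to query:

```python
# pip install databricks-sql-connector pyodbc
from databricks import sql as dbsql
import pyodbc

# Read the shared Delta table over a Databricks SQL warehouse (all values are placeholders).
with dbsql.connect(
    server_hostname="adb-<workspace-id>.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as dbx_conn:
    with dbx_conn.cursor() as cur:
        cur.execute("SELECT * FROM <share_catalog>.<schema>.<table>")
        rows = [tuple(r) for r in cur.fetchall()]
        columns = [c[0] for c in cur.description]

# Bulk-insert the rows into the destination Azure SQL Database table.
sql_conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<database>;"
    "UID=<user>;PWD=<password>"
)
with pyodbc.connect(sql_conn_str) as sql_conn:
    cur = sql_conn.cursor()
    cur.fast_executemany = True  # speeds up executemany against SQL Server
    placeholders = ", ".join("?" for _ in columns)
    cur.executemany(
        f"INSERT INTO dbo.<target_table> ({', '.join(columns)}) VALUES ({placeholders})",
        rows,
    )
    sql_conn.commit()
```

For large tables you would normally stream the rows in batches (for example with `fetchmany`) rather than pulling everything into memory at once.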

In summary, there are multiple ways to connect Delta tables in Databricks to Azure SQL Database for loading data. You can choose the method that best fits your requirements and technical expertise.
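
Since the data receiver does not have a Databricks workspace, another route worth mentioning (not covered in the list above, so treat it as a hedged suggestion rather than the recommended method) is the open-source `delta-sharing` Python client, which reads a Delta Sharing share directly from the provider's sharing server using only the profile file (`config.share`) the provider issues. A minimal sketch, with the share, schema, table, and connection details below as placeholders, loads the shared table into pandas and writes it to Azure SQL Database via SQLAlchemy:

```python
# pip install delta-sharing sqlalchemy pyodbc
import delta_sharing
from sqlalchemy import create_engine

# Profile file issued by the data provider (contains the sharing server URL and token).
profile_file = "config.share"

# Fully qualified table name in the form <share>.<schema>.<table> (placeholders).
table_url = f"{profile_file}#<share>.<schema>.<table>"

# Read the shared Delta table into a pandas DataFrame (no Databricks workspace needed).
df = delta_sharing.load_as_pandas(table_url)

# Write the DataFrame into the destination Azure SQL Database table.
engine = create_engine(
    "mssql+pyodbc://<user>:<password>@<server>.database.windows.net:1433/"
    "<database>?driver=ODBC+Driver+18+for+SQL+Server"
)
df.to_sql("target_table", engine, schema="dbo", if_exists="append", index=False)
```

This keeps the pipeline entirely outside Databricks, at the cost of running and maintaining a small script yourself.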

Hope this helps. Do let us know if you have any further queries.


If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And, if you have any further query, do let us know.

