Hi,
We write a large number of records to a Synapse dedicated SQL pool using the Synapse connector. We are able to do this with user ID and password, and with client ID and secret. However, we wanted to check whether this is also possible with a client ID and a certificate and/or private key.
We are using the code below to set spark.conf for the Synapse connector:
# Defining the service principal credentials for the Azure storage account
spark.conf.set("fs.azure.account.auth.type", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id", "<application-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret", "<service-credential>")
spark.conf.set("fs.azure.account.oauth2.client.endpoint", "https://login.microsoftonline.com/<directory-id>/oauth2/token")
# Defining a separate set of service principal credentials for Azure Synapse Analytics (If not defined, the connector will use the Azure storage account credentials)
spark.conf.set("spark.databricks.sqldw.jdbc.service.principal.client.id", "<application-id>")
spark.conf.set("spark.databricks.sqldw.jdbc.service.principal.client.secret", "<service-credential>")
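On the storage side, the ABFS driver also supports a pluggable token provider via the "Custom" auth type, which in principle could wrap certificate-based token acquisition. A minimal sketch of that configuration, assuming you implement your own provider class (the class name `com.example.CertTokenProvider` is hypothetical, not a shipped provider):

```python
# Hedged sketch: the ABFS driver allows a custom OAuth token provider.
# The provider class must implement
# org.apache.hadoop.fs.azurebfs.extensions.CustomTokenProviderAdaptee
# and would obtain tokens using the SPN certificate internally.
# "com.example.CertTokenProvider" is a placeholder/hypothetical class.
spark.conf.set("fs.azure.account.auth.type", "Custom")
spark.conf.set("fs.azure.account.oauth.provider.type", "com.example.CertTokenProvider")
```

Note this only covers the storage (ABFS) leg; whether the `spark.databricks.sqldw.jdbc.service.principal.*` options accept anything other than a secret is exactly the open question here.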
We're aware of MSAL and have successfully used it with both secrets and certificates. However, that path is quite slow compared to the Spark Synapse connector when ingesting large volumes of data.
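For reference, a minimal sketch of the MSAL certificate flow mentioned above, assuming the Synapse SQL scope `https://database.windows.net/.default`; the tenant/client IDs and certificate inputs are placeholders:

```python
# Hedged sketch of certificate-based token acquisition with MSAL Python.
# TENANT_ID / CLIENT_ID are placeholders to fill in.
TENANT_ID = "<directory-id>"
CLIENT_ID = "<application-id>"
SCOPE = ["https://database.windows.net/.default"]  # Synapse SQL resource

def authority(tenant_id):
    # MSAL authority URL for a single-tenant app registration
    return f"https://login.microsoftonline.com/{tenant_id}"

def acquire_token(private_key_pem, thumbprint):
    # MSAL accepts a certificate credential as a dict with
    # "private_key" (PEM text) and "thumbprint" (hex-encoded SHA-1).
    import msal  # pip install msal
    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=authority(TENANT_ID),
        client_credential={"private_key": private_key_pem,
                           "thumbprint": thumbprint},
    )
    return app.acquire_token_for_client(scopes=SCOPE)
```

This acquires the token outside Spark; the missing piece is a supported way to hand that token (or the certificate itself) to the Synapse connector's service-principal options.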
Is there a way to use the Spark Synapse connector with an SPN + certificate?