I am using this command in the notebook:

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "XXXXX Application ID in Azure Active Directory",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope = "secret", key = "Key"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/Directory ID/oauth2/token"
}
dbutils.fs.mount(
    source = "abfss://raw@mydatalake.dfs.core.windows.net/",
    mount_point = "/mnt/raw",
    extra_configs = configs)

dbutils.fs.ls("/mnt/raw/")
But it throws this error:

StatusCode=403
StatusDescription=This request is not authorized to perform this operation using this permission.
ErrorCode=AuthorizationPermissionMismatch
I have granted the Service Principal the required permissions (I made it Owner), but it still throws this error. This looks like a bug in the product.
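For context, the Owner assignment I made was roughly equivalent to the following Azure CLI command. The IDs and names here are placeholders standing in for my actual values, not the real ones:

```shell
# Placeholders throughout; this mirrors the Owner role assignment I made
# on the storage account for the service principal.
az role assignment create \
  --assignee "<application-id>" \
  --role "Owner" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/mydatalake"
```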
[Note: As part of the migration from MSDN, this question was posted by an Azure Cloud Engineer as a frequently asked question.]
Sourced from MSDN – Azure Databricks