RamyaHarinarthini-MSFT asked vinayakhegde-4562 commented

Azure Databricks throwing 403 error

I am using this command in the notebook:

    configs = {"fs.azure.account.auth.type": "OAuth",
               "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
               "fs.azure.account.oauth2.client.id": "XXXXX Application ID in Azure Active Directory",
               "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope = "secret", key = "Key"),
               "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/Directory ID/oauth2/token"}
    dbutils.fs.mount(
        source = "abfss://raw@mydatalake.dfs.core.windows.net/",
        mount_point = "/mnt/raw", extra_configs = configs)

    dbutils.fs.ls("/mnt/raw/")

But it throws this error:

 StatusCode=403 
 StatusDescription=This request is not authorized to perform this operation using this permission. 
 ErrorCode=AuthorizationPermissionMismatch 

I have granted the service principal the required permissions (made it Owner), but it still throws this error. This looks like a bug in the product.

[Note: As we migrate from MSDN, this question has been posted by an Azure Cloud Engineer as a frequently asked question]


Sourced from MSDN – Azure Databricks


azure-databricks

PRADEEPCHEEKATLA-MSFT answered Manishchoudhary-3986 commented

Welcome to the Microsoft Q&A platform.

Happy to answer your questions.

Note: When performing the steps in “Assign the application to a role,” make sure to assign the Storage Blob Data Contributor role to the service principal.
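For reference, the OAuth mount settings from the question can be assembled with a small helper so that only the tenant and application values change per environment. This is an illustrative sketch — the helper name and placeholder arguments are mine, not part of the thread:

```python
# Sketch: build the ABFS OAuth configs used for mounting ADLS Gen2
# with a service principal. client_id / client_secret / tenant_id
# are hypothetical placeholders, not real values.
def abfs_oauth_configs(client_id, client_secret, tenant_id):
    """Return the Spark configs for an ADLS Gen2 service-principal mount."""
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": client_id,
        "fs.azure.account.oauth2.client.secret": client_secret,
        "fs.azure.account.oauth2.client.endpoint":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

configs = abfs_oauth_configs("app-id", "app-secret", "tenant-id")
```

On a cluster you would pass the returned dict as `extra_configs` to `dbutils.fs.mount`; note that the 403 itself is fixed by the role assignment below, not by these settings.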

Repro: I granted the Owner role to the service principal and ran “dbutils.fs.ls("/mnt/azure/")”; it returned the same error message as above.

(screenshot: add1.jpg)

Solution: Assigned the Storage Blob Data Contributor role to the service principal.

(screenshot: add2.jpg)

Finally, I was able to get the output without any error message after assigning the Storage Blob Data Contributor role to the service principal.


(screenshot: add3.jpg)

For more details, refer to “Tutorial: Azure Data Lake Storage Gen2, Azure Databricks & Spark”.

Hope this helps.

Sourced from MSDN – Azure Databricks




Valid answer.

IshantKaushik-8681 answered vinayakhegde-4562 commented

I also faced the same issue but later figured out that you need to have only the Storage Blob Data Contributor role assigned on your data lake for your service principal.
If you have granted only the Contributor role, it will not work.
Nor will granting both Contributor and Storage Blob Data Contributor work.
You just have to grant Storage Blob Data Contributor on your Data Lake Gen2.
(screenshot: image.png)
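The behaviour described above can be summarised in a small illustrative check: ADLS Gen2 data access requires a data-plane RBAC role, while Owner and Contributor are management-plane roles. The role names below are Azure's built-in roles, but the function itself is hypothetical, not an Azure API:

```python
# Illustrative sketch of why Owner/Contributor alone return 403:
# only *data-plane* roles grant blob data access. The role names
# are real Azure built-in roles; the check itself is hypothetical.
DATA_PLANE_ROLES = {
    "Storage Blob Data Owner",
    "Storage Blob Data Contributor",
    "Storage Blob Data Reader",
}

def can_read_blobs(assigned_roles):
    """True if any assigned role grants blob data-plane access."""
    return bool(DATA_PLANE_ROLES & set(assigned_roles))

print(can_read_blobs({"Owner"}))                          # False -> 403
print(can_read_blobs({"Storage Blob Data Contributor"}))  # True
```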




This worked, thanks. You just need to have only one permission: “Storage Blob Data Contributor”.
