Dear All,
I am trying to connect a (new) Synapse Analytics workspace to an external storage account (Data Lake Gen2) from a PySpark notebook (running on a dedicated Spark pool).
To be able to read the data with a simple query, I added the account I am currently signed in with (userXXX) as Storage Blob Data Contributor on container1 (via IAM).
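(For reference, I did the role assignment through the portal; an equivalent Azure CLI sketch would look like the following. The subscription ID and resource group are placeholders I have not filled in here.)

```shell
# Placeholders: fill in before running.
SUB_ID="<subscription-id>"
RG="<resource-group>"
ACCOUNT="mystorageaccountname"
CONTAINER="container1"

# Resource ID of the container, used as the role-assignment scope.
SCOPE="/subscriptions/$SUB_ID/resourceGroups/$RG/providers/Microsoft.Storage/storageAccounts/$ACCOUNT/blobServices/default/containers/$CONTAINER"
echo "$SCOPE"

# Grant the user access at container scope (commented out; needs a signed-in az session):
# az role assignment create \
#   --role "Storage Blob Data Contributor" \
#   --assignee "userXXX" \
#   --scope "$SCOPE"
```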
When I run the following code in a cell:
account_name = 'mystorageaccountname'
container_name = 'container1'
relative_path = 'path_to_parquet_files'

# abfss://<container>@<account>.dfs.core.windows.net/<path>
adls_path = 'abfss://%s@%s.dfs.core.windows.net/%s' % (container_name, account_name, relative_path)

df = spark.read.parquet(adls_path)
df.show()
I receive the following error:
Py4JJavaError : An error occurred while calling o334.parquet.
: Operation failed: "This request is not authorized to perform this operation.", 403, HEAD, https://mystorageaccountname.dfs.core.windows.net/container1//?upn=false&action=getAccessControl&timeout=90
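As a fallback I am considering authenticating with the storage account key directly instead of relying on my signed-in identity, along the lines of the sketch below (the key value is a placeholder, and I have not tried this yet; I would prefer RBAC to work):

```python
# Sketch only: the standard Hadoop ABFS setting for account-key auth.
account_name = 'mystorageaccountname'
conf_key = f'fs.azure.account.key.{account_name}.dfs.core.windows.net'

# In the notebook this would be (placeholder key, requires a Spark session):
# spark.conf.set(conf_key, '<storage-account-key>')
# df = spark.read.parquet(adls_path)
```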
Do you have any suggestions for solving this issue? Is there something I have forgotten?
Thanks
Ra