question

Stramzik-9592 asked:

Unable to mount datalake gen1 to databricks

I have been mounting Data Lake Gen1 storage in Databricks to access and process files. It worked fine for the past year, but all of a sudden I am getting the error below:

 configs = {"fs.adl.oauth2.access.token.provider.type": "ClientCredential",
            "fs.adl.oauth2.client.id": dbutils.secrets.get(scope = "scope1", key = "scope1ID"),
            "fs.adl.oauth2.credential": dbutils.secrets.get(scope = "scope1", key = "scope1Secret"),
            "fs.adl.oauth2.refresh.url": "https://login.microsoftonline.com/####REPLACED FOR SECURITY###/oauth2/token"}

 # Optionally, you can add <directory-name> to the source URI of your mount point.
 dbutils.fs.mount(
   source = "adl://company.azuredatalakestore.net/sandbox/",
   mount_point = "/mnt/testFolder",
   extra_configs = configs)

 Error :
 ExecutionError: An error occurred while calling o334.mount.
 : com.microsoft.azure.datalake.store.ADLException: Error creating directory /
 Error fetching access token
 Operation null failed with exception java.io.IOException : AADToken: HTTP connection failed for getting token from AzureAD due to timeout.  Client Request Id :##### MASKED FOR SECURITY## Latency(ns) : 73677066


Can somebody help me fix this?

Tags: azure-databricks, azure-data-lake-storage

HimanshuSinha-MSFT answered:

Hello @Stramzik-9592,
Thanks for the ask, and for using the Microsoft Q&A platform.
Since you mention it had been working fine for a year, there is a good chance that the client secret has expired on the Azure side. When we create a client secret we also have to set an expiry date. The screen below makes this clearer.

119100-image.png


Please do let me know how it goes .
Thanks
Himanshu
Please do consider clicking on "Accept Answer" and "Up-vote" on the post that helps you, as it can be beneficial to other community members
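If it helps to confirm the expired-secret hypothesis outside Databricks, the token request that the mount performs can be reproduced directly against the AAD v1 endpoint. A minimal sketch, assuming the standard client-credentials flow for ADLS Gen1; the request builder is plain Python, and the actual POST (which needs real, unmasked values) is left as a comment:

```python
# Sketch (assumed flow): reproduce the client-credentials token request that
# the ADLS Gen1 mount sends to the oauth2 refresh URL.
def build_token_request(tenant_id, client_id, client_secret):
    """Return (url, form_data) for the AAD v1 client-credentials flow."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    data = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # AAD v1 endpoints identify the target service via `resource`;
        # this is the ADLS Gen1 resource URI.
        "resource": "https://datalake.azure.net/",
    }
    return url, data

url, data = build_token_request("<tenant-id>", "<client-id>", "<client-secret>")
# With real values substituted:
#   import requests
#   resp = requests.post(url, data=data)
# A 200 means the secret is still valid; an AADSTS error mentioning expired
# client secret keys confirms the expiry.
```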





Stramzik-9592 answered:

@HimanshuSinha-MSFT

We are using a service principal. The client secret for the service principal had indeed expired, and we have updated it.

If I use the same client secret to read files into a Spark dataframe directly, without mounting, it works. However, when I try to mount I get the error.
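One pattern that would explain this split (direct reads work, mount fails) is drift between the two sets of OAuth settings, including the key prefix, which Databricks docs have written as both `fs.adl` and `dfs.adl` depending on runtime version. A hedged sketch that builds one dict usable for both paths so they cannot drift apart; the default prefix is an assumption to check against your runtime's docs:

```python
def adls_gen1_configs(tenant_id, client_id, client_secret, prefix="fs.adl"):
    """OAuth settings for ADLS Gen1. The same key/value pairs serve as
    extra_configs for dbutils.fs.mount and as spark.conf.set pairs for
    direct access; `prefix` varies by Databricks runtime (fs.adl here,
    dfs.adl in older docs)."""
    return {
        f"{prefix}.oauth2.access.token.provider.type": "ClientCredential",
        f"{prefix}.oauth2.client.id": client_id,
        f"{prefix}.oauth2.credential": client_secret,
        f"{prefix}.oauth2.refresh.url":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# Inside Databricks (sketch -- spark and dbutils exist only there):
#   cfg = adls_gen1_configs(tenant, client_id, secret)
#   for k, v in cfg.items():            # direct-access path
#       spark.conf.set(k, v)
#   dbutils.fs.mount(                   # mount path, same credentials
#       source="adl://company.azuredatalakestore.net/sandbox/",
#       mount_point="/mnt/testFolder",
#       extra_configs=cfg)
```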


Hello @Stramzik-9592,

If I use the same client secret to read files to a spark dataframe directly without mounting it works. However when I try to mount I get the error.

This is a good piece of info. Looking at the code you shared, I think you will have to update the secret using:

databricks secrets put --scope <scope-name> --key <key-name>

https://docs.microsoft.com/en-us/azure/databricks/security/secrets/secrets#create-a-secret-in-a-databricks-backed-scope


Just to be clear: ADB supports Azure Key Vault (AKV) as a key store, but it also has a key store of its own, and I think you are using the ADB-backed one. In your place I would try listing the secrets with:

databricks secrets list --scope <scope-name>

and if it returns a result, update that secret with the new client secret value.

Thanks
Himanshu





Hi @HimanshuSinha-MSFT

I followed the steps you mentioned.

databricks secrets list --scope <scope-name>

The above command did give me the list of secrets.

I then tried to put the secret using:

databricks secrets put --scope scope1 --key secret1

and I get the error below:

Error: b'{"error_code":"BAD_REQUEST","message":"Cannot write secrets to Azure KeyVault-backed scope scope1"}'

So does that mean that we are using the AKV key store and not the internal one?

Just another piece of information I would like to add: the secret was supposed to expire on 21st July, but the mounting issue started occurring on 19th July itself. Is there any specific admin setting that could disable the mounting option?


So does that mean that we are using the AKV key store and not the internal one?
[Himanshu]: Correct, you are on Azure Key Vault. Sorry, what I suggested earlier was not of much use.

Just another piece of information I would like to add: the secret was supposed to expire on 21st July, but the mounting issue started occurring on 19th July itself. Is there any specific admin setting that could disable the mounting option?

[Himanshu]: I think whoever did the key rotation performed it two days before the expiry date, which is the correct thing to do. Since you have been struggling with this for a while, just one more question: they rotated the key, but given that the error above shows ADB is tied to AKV, did they also update the key in AKV?


Also, since you have been struggling with this for a long time: if you have a support plan, you may file a support ticket. Otherwise, please send an email to azcommunity@microsoft.com with the details below, so that we can create a one-time free support ticket for you to work closely on this matter.
Subscription ID:
Subject: Attn Himanshu
Please let me know once you have done the same.
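Himanshu's question above (did the rotated secret actually reach AKV?) can be answered by listing the secret's versions in Key Vault. A sketch under stated assumptions: the selection logic is plain Python; the AKV call in the comments assumes the `azure-keyvault-secrets` and `azure-identity` packages, a vault the caller can read, and hypothetical vault/secret names:

```python
from datetime import datetime, timezone

def latest_enabled_version(versions):
    """Given (version_id, created_on, enabled) tuples for a secret, return
    the id of the most recently created enabled version, or None."""
    enabled = [v for v in versions if v[2]]
    if not enabled:
        return None
    return max(enabled, key=lambda v: v[1])[0]

# With a real vault (hypothetical names), the tuples could come from:
#   from azure.identity import DefaultAzureCredential
#   from azure.keyvault.secrets import SecretClient
#   client = SecretClient("https://<vault-name>.vault.azure.net",
#                         DefaultAzureCredential())
#   versions = [(p.version, p.created_on, p.enabled)
#               for p in client.list_properties_of_secret_versions("scope1Secret")]
# If the newest enabled version predates the rotation date, the new client
# secret never reached AKV, and the mount keeps reading the expired one.

versions = [
    ("old", datetime(2020, 7, 21, tzinfo=timezone.utc), True),
    ("rotated", datetime(2021, 7, 19, tzinfo=timezone.utc), True),
]
print(latest_enabled_version(versions))  # → rotated
```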



VaibhavChaudhari answered:

Is the new client secret updated in Key Vault as well?

Also see the document below, which says that if you already have mounts pointing to the same ADLS Gen1 store, try unmounting them first.

https://docs.microsoft.com/en-us/azure/databricks/kb/cloud/adls-gen1-mount-problem


Please don't forget to Accept Answer and Up-vote if the response helped -- Vaibhav
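The unmount-first step from the linked KB article can be scripted rather than checked by eye. A minimal sketch: the filtering is plain Python over (mount_point, source) pairs, the dbutils calls (Databricks-only) are shown as comments, and the source URI is the one from the question:

```python
def mounts_for_source(mounts, source_prefix):
    """Return mount points whose source starts with the given ADLS URI.
    `mounts` is a list of (mount_point, source) pairs, as reported by
    dbutils.fs.mounts() (which yields objects with .mountPoint/.source)."""
    return [mp for mp, src in mounts if src.startswith(source_prefix)]

# Inside Databricks (sketch -- dbutils exists only there):
#   pairs = [(m.mountPoint, m.source) for m in dbutils.fs.mounts()]
#   for mp in mounts_for_source(pairs, "adl://company.azuredatalakestore.net"):
#       dbutils.fs.unmount(mp)   # then re-run dbutils.fs.mount(...)

pairs = [("/mnt/other", "wasbs://logs@acct.blob.core.windows.net"),
         ("/mnt/testFolder", "adl://company.azuredatalakestore.net/sandbox/")]
print(mounts_for_source(pairs, "adl://company.azuredatalakestore.net"))
# → ['/mnt/testFolder']
```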


@VaibhavChaudhari

Yep, the new secret is updated in Key Vault. I can use the same credentials to read files from the Data Lake using a spark.conf connection. However, I can't mount Data Lake Gen1.

I have tried to unmount, but it says nothing is mounted.


Did you run the below to check all mounts? Just cross-checking:

display(dbutils.fs.mounts())


Yes, I did run it, but those mounts belong to different users in our org. I have checked the mount points and nothing is linked to the Data Lake folder I'm trying to mount.
