4

I followed the documentation azure-datalake-gen2-sp-access and mounted an ADLS Gen2 storage account in Databricks, but when I try to view the data from the GUI I get the following error:

Cluster easy-matches-cluster-001 does not have the proper credentials to view the content. Please select another cluster.
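
For reference, I mounted the storage roughly along the lines of the service-principal pattern in that doc (a sketch; the container, account, tenant, secret scope, and key names below are all placeholders):

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<scope>", key="<key>"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-point>",
    extra_configs=configs,
)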


I can't find any documentation about this error, only something about premium Databricks. Does this mean I can only access the data with a premium Databricks resource?

Edit 1: I can see the mounted storage with dbutils.
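
For example, both of these work from a notebook (the mount point name is a placeholder):

display(dbutils.fs.mounts())
dbutils.fs.ls("/mnt/<mount-point>")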


Andrés Bustamante
  • Before trying to access it through the GUI, can you please list the mount locations with display(dbutils.fs.mounts()) and see whether your folder is listed there? – Karthikeyan Rasipalay Durairaj Oct 06 '21 at 06:17
  • What version of Databricks Runtime are you using? Can you try accessing ADLS Gen2 directly? – KarthikBhyresh-MT Oct 06 '21 at 12:09
  • Hi @KarthikBhyresh-MT, I can access the storage directly, I have admin access to all the resources, and the version is '9.0.x-scala2.12' – Andrés Bustamante Oct 06 '21 at 13:16
  • Hi @KarthikeyanRasipalayDurairaj, I can see it with dbutils; I edited the question to show the evidence – Andrés Bustamante Oct 06 '21 at 13:17
  • @AndrésBustamante - Can you please make sure the same location hasn't been mounted at two different /mnt locations, and try unmounting and mounting again? Reference: https://stackoverflow.com/questions/58996954/mount-error-when-trying-to-access-the-azure-dbfs-file-system-in-azure-databricks – Karthikeyan Rasipalay Durairaj Oct 07 '21 at 15:27

3 Answers

1

After mounting the storage account, please run this command to check whether you have data access permissions on the mount point you created:

dbutils.fs.ls("/mnt/<mount-point>")
  • If you have data access, you will see the files inside the storage account.
  • If you don't have data access, you will get a 403 error: "This request is not authorized to perform this operation using this permission." (See the sketch after this list.)
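
A quick way to check both cases at once from a notebook (a sketch; the mount point name is a placeholder):

try:
    # Lists the files if the principal has data-plane access
    display(dbutils.fs.ls("/mnt/<mount-point>"))
except Exception as e:
    # A 403 "This request is not authorized..." here points to a missing
    # role assignment on the storage account, not a mount problem
    print(f"Access check failed: {e}")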

If you are able to mount the storage but unable to access it, check whether the ADLS Gen2 account has the necessary roles assigned.


I was able to reproduce the same issue. Since you are using an Azure Active Directory application, you have to assign the "Storage Blob Data Contributor" role to the Azure Active Directory application as well.

Below are the steps for granting the Storage Blob Data Contributor role to the registered application:

1. Select your ADLS account. Navigate to Access Control (IAM). Select Add role assignment.


2. Select the role Storage Blob Data Contributor, then search for and select your registered Azure Active Directory application and assign the role.

Back in the Access Control (IAM) tab, search for your AAD app and verify its access.


3. Run dbutils.fs.ls("/mnt/<mount-point>") to confirm access.


CHEEKATLAPRADEEP
KarthikBhyresh-MT
0

Solved by unmounting, remounting, and restarting the cluster. I followed this doc: https://learn.microsoft.com/en-us/azure/databricks/kb/dbfs/remount-storage-after-rotate-access-key
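
A minimal sketch of the remount procedure from that doc (the mount point, source names, and configs are placeholders; reuse the same service-principal configs from the original mount):

mount_point = "/mnt/<mount-point>"

# Unmount only if the mount point actually exists
if any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
    dbutils.fs.unmount(mount_point)

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point=mount_point,
    extra_configs=configs,  # placeholder: the configs used for the original mount
)

# Finally, restart the cluster so it picks up the refreshed mount.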

Andrés Bustamante
0

If you still encounter the same issue even after the Access Control (IAM) roles check out, do the following (a sketch follows the list):

  1. Use dbutils.fs.unmount("<mount-point>") to unmount all storage accounts (it takes the mount point as an argument).
  2. Restart the cluster.
  3. Remount the storage accounts.
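
A sketch of steps 1 and 3 (assumes every mount under /mnt/ should be recreated; the remount names and configs are placeholders):

# Step 1: unmount every user mount under /mnt/ (system mounts are skipped)
for m in dbutils.fs.mounts():
    if m.mountPoint.startswith("/mnt/"):
        dbutils.fs.unmount(m.mountPoint)

# Step 2: restart the cluster from the Compute page.

# Step 3: remount with the original source and configs, e.g.:
dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-point>",
    extra_configs=configs,  # placeholder: the original mount configs
)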