
I'm trying to mount an Azure Blob storage container to DBFS. My implementation is below, but it fails with the error shown further down. I'm not sure why I get "Did you remove the AWS key for the mount point?" when I'm connecting to Azure Blob storage, not AWS.

Am I missing anything here? Could you please help me solve this issue? Thanks.

Code Implementation

dbutils.fs.mount(
  source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net",
  mount_point = "/mnt/iotdata",
  extra_configs = {"fs.azure.account.key.<storage-account-name>.blob.core.windows.net":dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")})

Error

ExecutionError: An error occurred while calling o275.mount.
: com.databricks.backend.daemon.data.common.InvalidMountException: The backend could not get session tokens for path /mnt. Did you remove the AWS key for the mount point?
    at com.databricks.backend.daemon.data.common.InvalidMountException$.apply(DataMessages.scala:612)
    at com.databricks.backend.daemon.data.filesystem.MountEntryResolver.resolve(MountEntryResolver.scala:84)

1 Answer


I followed the steps below to successfully mount and read data into a dataframe from Azure Blob storage.

Step 1: Import pyspark and SparkSession

import pyspark
from pyspark.sql import SparkSession


Step 2: Use the command below to unmount any existing mount point

dbutils.fs.unmount("/mnt")

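Before unmounting, you can check what is currently mounted. A minimal sketch, assuming the standard Databricks dbutils API:

# List all current mount points to see whether /mnt is already in use
for m in dbutils.fs.mounts():
    print(m.mountPoint, "->", m.source)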

Step 3: Get the storage account key

Go to the storage account and select Access keys.


Now click on Show keys.


Then copy the first key.


Step 4: Mount the blob storage with the code below:

dbutils.fs.mount(
  source = "wasbs://<container_name>@<storage_account_name>.blob.core.windows.net",
  mount_point = "/mnt/Sales",
  # Paste the storage account key copied in Step 3 in place of <storage_account_key>
  extra_configs = {"fs.azure.account.key.<storage_account_name>.blob.core.windows.net": "<storage_account_key>"})


In your code, remove dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>") and replace it with the Azure Blob storage key itself.
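Applied to your snippet, the call would look like this (the key value is a placeholder for the key copied in Step 3):

dbutils.fs.mount(
  source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net",
  mount_point = "/mnt/iotdata",
  # Literal storage account key instead of a secret-scope lookup
  extra_configs = {"fs.azure.account.key.<storage-account-name>.blob.core.windows.net": "<storage-account-key>"})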

Step 5: Read data into a dataframe from the blob storage container

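A minimal sketch, assuming a CSV file named data.csv (hypothetical name) sits at the root of the mounted container:

# Read a CSV file from the mount point into a Spark dataframe
df = spark.read.format("csv") \
    .option("header", "true") \
    .option("inferSchema", "true") \
    .load("/mnt/Sales/data.csv")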

Step 6: Dataframe

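For example, the contents can be inspected with:

# Print the first rows of the dataframe
df.show()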

Step 7: Original CSV file in blob storage

(screenshot of the original CSV file in the blob storage container)

  • It didn't work for me. Are there any permissions I need to look into before mounting, or any extra permissions I need to give on the access key or scope? – Sri Aug 31 '21 at 19:02
  • No, no permissions are needed. The steps given in the answer above are enough. I have updated my answer to show where you can get the storage account key. – Abhishek K Sep 01 '21 at 05:33