"I have a requirement where I need to mount a GCP storage bucket on a GCP Databricks cluster, but I can't find any appropriate documentation on it; my intention is to use that bucket as an external table location."
Hi Shahid, do let me know if my recommendations were helpful. – Vaidehi Jamankar Nov 07 '22 at 06:08
1 Answer
Databricks mounts create a link between a workspace and cloud object storage, which lets you interact with cloud object storage using familiar file paths relative to the Databricks file system. Tight integration with Google Cloud Storage, BigQuery, and the Google Cloud AI Platform enables Databricks to work seamlessly across data and AI services on Google Cloud.

To read from or write to a GCS bucket, you must create an attached service account and associate the bucket with that service account when creating the cluster. Alternatively, you can connect to the bucket directly with a key that you generate for the service account.
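Since the goal is to use the bucket as an external table location: once the cluster is launched with the bucket's service account attached, the bucket can be addressed directly with a `gs://` path, and an external table can be registered over it. A minimal sketch, assuming placeholder bucket, path, and table names (the commented `spark.read` / `spark.sql` calls only work inside a notebook attached to such a cluster):

```python
# Placeholder names for illustration; the cluster must already have the
# bucket's service account attached for gs:// access to work.
bucket = "my-gcs-bucket"
path = f"gs://{bucket}/sales/"

# In a notebook on such a cluster you could read the data directly:
# df = spark.read.format("parquet").load(path)

# ...or register an external table over that location:
create_table_sql = (
    "CREATE TABLE IF NOT EXISTS sales_ext "
    f"USING PARQUET LOCATION '{path}'"
)
# spark.sql(create_table_sql)
```

Because the table is created with an explicit `LOCATION`, dropping it later leaves the files in the bucket untouched.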
For documentation on working with mounted GCS buckets, see Mounting cloud object storage on Databricks.
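The mounting approach described above can be sketched as follows. This is a hedged example with placeholder bucket and mount names; the actual `dbutils.fs.mount` call (shown in comments) only runs in a Databricks notebook on a cluster launched with the bucket's service account:

```python
def gcs_mount_args(bucket_name: str, mount_name: str) -> tuple[str, str]:
    """Build the (source, mount_point) pair for mounting a GCS bucket
    into the Databricks file system."""
    return (f"gs://{bucket_name}", f"/mnt/{mount_name}")

# Placeholder names for illustration.
source, mount_point = gcs_mount_args("my-gcs-bucket", "gcs-mount")

# In a notebook on a cluster with the service account attached:
# dbutils.fs.mount(source, mount_point)
# display(dbutils.fs.ls(mount_point))   # verify the mount
```

Once mounted, the external table location can reference `/mnt/gcs-mount/...` instead of the raw `gs://` URI.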

Vaidehi Jamankar