I have tried with a p12 keyfile and it works successfully: I was able to fetch data from the GCS bucket. But with a JSON keyfile, the SparkSession is not picking up the JSON config values; instead it falls back to the default metadata service. I am using Maven and IntelliJ for development. Below is the code snippet:
import org.apache.spark.SparkFiles
import org.apache.spark.sql.SparkSession

def main(args: Array[String]): Unit = {
  System.out.println("hello gcp connect")
  System.setProperty("hadoop.home.dir", "C:/hadoop/")

  val sparkSession =
    SparkSession.builder()
      .appName("my first project")
      .master("local[*]")
      .config("spark.hadoop.fs.gs.project.id", "shaped-radius-297301")
      .config("spark.hadoop.fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")
      .config("spark.hadoop.fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS")
      .config("spark.hadoop.google.cloud.project.id", "shaped-radius-297301")
      .config("spark.hadoop.google.cloud.auth.service.account.enable", "true")
      .config("spark.hadoop.google.cloud.auth.service.account.email", "service-account@shaped-radius-297301.iam.gserviceaccount.com")
      .config("spark.hadoop.google.cloud.service.account.json.keyfile", "C:/Users/shaped-radius-297301-5bf673d7f0d2.json")
      .getOrCreate()

  // Pull the CSV down from the bucket and read it locally
  sparkSession.sparkContext.addFile("gs://test_bucket/sample1.csv")
  sparkSession.read.csv(SparkFiles.get("sample1.csv")).show()
}
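For comparison, the p12-based setup that does work for me looks roughly like this (a sketch from memory; the keyfile path is a placeholder, and the property names are the ones I understand the GCS connector to use for p12 credentials):

  // Sketch of the working p12 configuration (placeholder keyfile path;
  // property names as I understand them from the GCS connector docs)
  val p12Session =
    SparkSession.builder()
      .appName("my first project")
      .master("local[*]")
      .config("spark.hadoop.fs.gs.project.id", "shaped-radius-297301")
      .config("spark.hadoop.fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")
      .config("spark.hadoop.fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS")
      .config("spark.hadoop.google.cloud.auth.service.account.enable", "true")
      .config("spark.hadoop.google.cloud.auth.service.account.email", "service-account@shaped-radius-297301.iam.gserviceaccount.com")
      .config("spark.hadoop.google.cloud.auth.service.account.keyfile", "C:/Users/my-service-account.p12") // placeholder p12 path
      .getOrCreate()
  p12Session.sparkContext.addFile("gs://test_bucket/sample1.csv")
  p12Session.read.csv(SparkFiles.get("sample1.csv")).show()

With that p12 session the read succeeds, so the bucket, project, and service account themselves seem fine; only the JSON keyfile variant ignores the configured credentials.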