
I have an SQL script which creates temp tables that are valid only for that session. After running the script, I am trying to read data from those tables through Spark and then process them. Below is the code I have for the Spark read.

    Dataset<Row> df = sparkSession.read().format("jdbc")
        .option("url", jdbcURL)
        .option("dbtable", tableOrQuery)
        .option("user", userName)
        .option("password", password)
        .option("driver", driverName)
        .load();

Now I need to pass the JDBC connection I created so that Spark can read the data in the same session. Is this possible?

HariJustForFun
  • Incidentally, I answered [a similar question](https://stackoverflow.com/q/54417010/10938362) not so long ago. TL;DR: there can be no such option whatsoever. – user10938362 Feb 12 '19 at 18:50

1 Answer


No, you cannot pass a JDBC connection to Spark. Spark manages its JDBC connections by itself.

For reference, see `JdbcRelationProvider` and `JdbcUtils.createConnectionFactory` in the Spark source: Spark builds its own connection factory there and opens connections directly through the driver.
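
To illustrate why this can't work: every Spark read task opens a fresh connection, and a temp table created in your own session is invisible to any other session. Here is a minimal plain-JDBC sketch of that behavior (the URL, table name, and temp-table syntax are illustrative and vary by database):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class SessionScopeDemo {
        public static void main(String[] args) throws SQLException {
            String url = "jdbc:..."; // hypothetical JDBC URL

            // Session 1: create a temp table, visible only to this connection.
            try (Connection session1 = DriverManager.getConnection(url, "user", "password");
                 Statement stmt1 = session1.createStatement()) {
                stmt1.execute("CREATE TEMPORARY TABLE tmp_data (id INT)");
                stmt1.execute("INSERT INTO tmp_data VALUES (1)");

                // Session 2: a separate connection, like the ones Spark
                // opens for each read task.
                try (Connection session2 = DriverManager.getConnection(url, "user", "password");
                     Statement stmt2 = session2.createStatement()) {
                    // Fails on most databases: tmp_data does not exist
                    // in this session.
                    stmt2.executeQuery("SELECT * FROM tmp_data");
                }
            }
        }
    }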

Duy Nguyen
  • Thanks. Do you have any suggestions on what I can do instead? – HariJustForFun Feb 12 '19 at 16:52
  • You can get the connection metadata and build the Spark URL from it; a solution is in https://stackoverflow.com/questions/5718952/how-to-get-database-url-from-java-sql-connection – Duy Nguyen Feb 13 '19 at 08:03
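
Following that suggestion, here is a minimal sketch (method and variable names are illustrative) that rebuilds the Spark read options from an existing connection's metadata via `DatabaseMetaData.getURL()`. Note this only reuses the URL, not the session, so session-scoped temp tables still won't be visible to Spark:

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.SQLException;

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class JdbcUrlFromConnection {
        // Build a Spark JDBC read using the URL of an already-open connection.
        static Dataset<Row> readViaSpark(SparkSession spark, Connection conn,
                                         String table, String user, String password)
                throws SQLException {
            DatabaseMetaData meta = conn.getMetaData();
            String jdbcUrl = meta.getURL(); // e.g. "jdbc:postgresql://host/db"

            return spark.read()
                .format("jdbc")
                .option("url", jdbcUrl)
                .option("dbtable", table)
                .option("user", user)
                .option("password", password)
                // add .option("driver", ...) if Spark cannot infer it from the URL
                .load();
        }
    }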