Here is my code:

```python
uname = "xxxxx"
pword = "xxxxx"
dbUrl = "jdbc:postgresql:dbserver"
table = "xxxxx"

jdbcDF = (spark.read.format("jdbc")
          .option("url", dbUrl)
          .option("dbtable", table)
          .option("user", uname)
          .option("password", pword)
          .load())
```
I'm getting a "No suitable driver" error, even though I added the Postgres driver jar first with `%Addjar -f https://jdbc.postgresql.org/download/postgresql-9.4.1207.jre7.jar`. Is there a working example of loading data from Postgres in PySpark 2.0 on DSX?
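For context, a hedged sketch of the usual fix: "No suitable driver" typically means the JVM cannot map the JDBC URL to a registered driver class, which can be worked around by naming the driver class explicitly via the `driver` option (`org.postgresql.Driver` is the standard Postgres driver class). Note also that `jdbc:postgresql:dbserver` is the shorthand form meaning "database `dbserver` on localhost"; a remote host needs the full `jdbc:postgresql://host:port/database` form. The helper names and placeholder values below are illustrative, not from the original post:

```python
def postgres_options(url, table, user, password):
    """Build the options dict for spark.read.format("jdbc").

    Passing "driver" explicitly avoids the "No suitable driver" error
    when the jar is on the classpath but the URL-to-driver mapping fails.
    """
    return {
        "url": url,          # full form: jdbc:postgresql://host:5432/dbname
        "dbtable": table,
        "user": user,
        "password": password,
        "driver": "org.postgresql.Driver",  # explicit driver class
    }

def read_postgres(spark, url, table, user, password):
    """Return a DataFrame read over JDBC; requires the Postgres jar on the classpath."""
    return (spark.read.format("jdbc")
            .options(**postgres_options(url, table, user, password))
            .load())
```

Usage would then be `jdbcDF = read_postgres(spark, dbUrl, table, uname, pword)`, with `dbUrl` in the full `jdbc:postgresql://...` form.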