I have installed the PostgreSQL driver package by running spark-shell after ssh'ing into the EMR cluster: spark-shell --packages org.postgresql:postgresql:9.4-1206-jdbc42. I then import org.postgresql. I want to create a Spark DataFrame, so I try to access a table on an RDS instance:
sqlContext.load("jdbc", Map(
  "url" -> "jdbc:postgresql://pathto.table.region.rds.amazonaws.com:5432/table?user=username&password=password",
  "dbtable" -> "table"))
This gives me a java.sql.SQLException: No suitable driver error.
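For context, this message comes from java.sql.DriverManager: it asks each registered JDBC driver whether it accepts the connection URL and throws when none does. The sketch below (plain Scala, no Spark needed; the RDS host is the placeholder from above) reproduces the same error on a classpath where no PostgreSQL driver is registered:

```scala
import java.sql.{DriverManager, SQLException}

// DriverManager matches the URL against every registered driver.
// With no PostgreSQL driver loaded, looking up a jdbc:postgresql URL
// fails with the same "No suitable driver" SQLException seen in spark-shell.
val url = "jdbc:postgresql://pathto.table.region.rds.amazonaws.com:5432/table"

val result =
  try {
    DriverManager.getDriver(url)
    "driver found"
  } catch {
    case e: SQLException => e.getMessage
  }

println(result)
```

This suggests the --packages jar is visible to the REPL (the import succeeds) but not to the classloader DriverManager consults when Spark opens the JDBC connection.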
I have looked into this question, which describes a similar problem, but I want to be able to load the driver through spark-shell.