I want to connect MySQL with PySpark. I am using a Jupyter notebook to run PySpark. However, when I do this:
dataframe_mysql = sqlContext.read.format("jdbc").options(
    url="jdbc:mysql://localhost:3306/playground",
    driver="com.mysql.jdbc.Driver",
    dbtable="play1",
    user="root",
    password="sp123").load()
I get the following error:
Py4JJavaError: An error occurred while calling o89.load. : java.lang.ClassNotFoundException: com.mysql.jdbc.Driver.
How can I resolve this error and load the MySQL data into a PySpark DataFrame?
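
From what I have read, the ClassNotFoundException seems to mean that the MySQL Connector/J JAR is not on Spark's classpath. Below is a sketch of what I think the fix might look like, passing the connector as a Maven package when the session is created (the connector version 8.0.33 and the newer driver class name com.mysql.cj.jdbc.Driver are my assumptions, not something I have confirmed). Is this the right approach?

from pyspark.sql import SparkSession

# Sketch: add the MySQL Connector/J package before the session (and JVM) starts.
# Version 8.0.33 is an assumption; any recent Connector/J release should work.
spark = (
    SparkSession.builder
    .appName("mysql-test")
    .config("spark.jars.packages", "mysql:mysql-connector-java:8.0.33")
    .getOrCreate()
)

dataframe_mysql = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://localhost:3306/playground")
    # com.mysql.jdbc.Driver is deprecated in newer connectors; cj is the new name
    .option("driver", "com.mysql.cj.jdbc.Driver")
    .option("dbtable", "play1")
    .option("user", "root")
    .option("password", "sp123")
    .load()
)

My understanding is that spark.jars.packages only takes effect if it is set before any Spark session has started in the notebook kernel, so I restart the kernel first. Is that correct, or is there a better way to make the driver available (for example, passing the JAR with --jars)?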