I am writing a Spark DataFrame into a Teradata table and getting the error below. Reading works fine; only the write fails, and I don't understand why the same driver that works for reading does not work for writing.
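For context, the read that works looks roughly like this (reconstructed from memory, so the exact options may differ slightly; it uses the same config values shown further down):

# Reading from Teradata over JDBC works with the same driver/URL/credentials.
df_td = spark.read.format("jdbc") \
    .option("url", jdbcUrl) \
    .option("driver", jdbcDriver) \
    .option("dbtable", "database.tbl_name") \
    .option("user", user) \
    .option("password", password) \
    .load()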
Here is the code that fails:

# Teradata connection settings.
config_dict = {
    "JDBCURL": "jdbc:teradata//${jdbcHostname}/database=${jdbcDatabase}",
    "JDBCDriver": "com.teradata.jdbc.TeraDriver",
    "DB_User": "dbc",
    "DB_PWD": "dbc",
}

jdbcUrl = config_dict["JDBCURL"]
jdbcDriver = config_dict["JDBCDriver"]
user = config_dict["DB_User"]
password = config_dict["DB_PWD"]

# DataFrame to export.
df = spark.sql("""select * from testing""")
# Write the DataFrame to the Teradata table over JDBC.
df.write.format("jdbc") \
    .mode("overwrite") \
    .option("url", jdbcUrl) \
    .option("dbtable", "database.tbl_name") \
    .option("user", user) \
    .option("password", password) \
    .save()
The write fails with:

java.sql.SQLException: No suitable driver
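One thing I noticed while writing this up: the write never passes the driver class to Spark, even though config_dict defines it. I assume the fix might look like the sketch below (the "driver" option is Spark's standard JDBC option for naming the driver class; this is my guess, not something I have verified):

# Hypothetical variant: pass the Teradata driver class explicitly so
# Spark does not rely on DriverManager matching the URL on its own.
(df.write.format("jdbc")
    .mode("overwrite")
    .option("url", jdbcUrl)
    .option("driver", jdbcDriver)  # com.teradata.jdbc.TeraDriver from config_dict
    .option("dbtable", "database.tbl_name")
    .option("user", user)
    .option("password", password)
    .save())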
Can someone help me figure out what I am missing?
When I connect to Teradata manually with the same credentials, the connection succeeds.
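By "manually" I mean a quick check outside Spark, roughly like this (a sketch using the teradatasql native Python driver; the host value is a placeholder):

# Standalone connectivity check outside Spark, using the native Python
# driver rather than the JDBC driver Spark uses. Host is a placeholder.
import teradatasql

with teradatasql.connect(host="${jdbcHostname}", user="dbc", password="dbc") as con:
    with con.cursor() as cur:
        cur.execute("select 1")
        print(cur.fetchone())  # prints [1] when the connection works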