I am using Spark JDBC to read a MySQL table. When Spark reads the table, the inferred schema marks every column as nullable, whereas the primary key columns should have nullable = false. I am using version 5.1.8 of the MySQL JDBC driver.
This is the read call I am using:

session.read.jdbc(s"${destOptions.getProperty("connection_string")}?useCompression=true&useSSL=false&autoReconnect=true", config.srcTable, andLogicPredicate, destOptions).selectExpr(primaryKeyArray: _*)
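For illustration, the only post-read fix I can think of is rebuilding the DataFrame with an adjusted schema, something along these lines (a sketch only; the helper name withNonNullableKeys is my own, and it assumes df is the DataFrame returned by the read above and that primaryKeyArray holds the key column names). I would rather have the JDBC read report the correct nullability directly:

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.types.{StructField, StructType}

// Force nullable = false on the given key columns by copying the schema
// and recreating the DataFrame; the underlying data is unchanged.
def withNonNullableKeys(spark: SparkSession, df: DataFrame, keys: Seq[String]): DataFrame = {
  val adjusted = StructType(df.schema.map {
    case StructField(name, dataType, _, metadata) if keys.contains(name) =>
      StructField(name, dataType, nullable = false, metadata)
    case other => other
  })
  spark.createDataFrame(df.rdd, adjusted)
}

// Usage (assumed names):
// val fixed = withNonNullableKeys(session, df, primaryKeyArray)

Is there a way to make Spark pick up the primary key's NOT NULL constraint from MySQL during the JDBC read itself, instead of patching the schema afterwards?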