I have a MySQL table with a TIMESTAMP(3) column that I want to load into a Spark DataFrame over JDBC. The MySQL JDBC driver fails while reading that TIMESTAMP(3) column.
Is there a configuration option, or an efficient way to supply my own Encoder, so that this column is parsed correctly?
Schema:
CREATE TABLE table_x
(
    user_id           VARCHAR(255)  NOT NULL,
    item_id           VARCHAR(255)  NOT NULL,
    serialized_item   MEDIUMTEXT    NOT NULL,
    creation_date     TIMESTAMP     DEFAULT CURRENT_TIMESTAMP    NOT NULL,
    last_updated_date TIMESTAMP(3)  DEFAULT CURRENT_TIMESTAMP(3) NOT NULL
);
Code:
import java.util.Properties

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types._

val conf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("AppName")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

// JDBC connection credentials
val props = new Properties()
props.setProperty("user", "...")
props.setProperty("password", "...")

// Read the whole table; the exception is thrown as soon as rows are materialized
val df = sqlContext.read.jdbc("...", "table_x", props)
df.take(10).foreach(println)
Stacktrace:
java.sql.SQLException: Cannot convert value '2016-03-30 09:41:03.043' from column 6 to TIMESTAMP.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1055)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:926)
at com.mysql.jdbc.ResultSetRow.getTimestampFast(ResultSetRow.java:1321)
at com.mysql.jdbc.BufferRow.getTimestampFast(BufferRow.java:573)
at com.mysql.jdbc.ResultSetImpl.getTimestampInternal(ResultSetImpl.java:6617)
at com.mysql.jdbc.ResultSetImpl.getTimestamp(ResultSetImpl.java:5943)
...
Caused by: java.lang.IllegalArgumentException: nanos > 999999999 or < 0
at java.sql.Timestamp.setNanos(Timestamp.java:389)
at com.mysql.jdbc.TimeUtil.fastTimestampCreate(TimeUtil.java:1135)
at com.mysql.jdbc.ResultSetImpl.fastTimestampCreate(ResultSetImpl.java:1030)
at com.mysql.jdbc.ResultSetRow.getTimestampFast(ResultSetRow.java:1310)
...
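For what it's worth, one workaround I am considering is to push a CAST into the query Spark sends, so the MySQL driver returns the column as plain CHAR instead of a TIMESTAMP(3), and to cast it back to a timestamp on the Spark side. This is only a sketch (it reuses sqlContext and props from above and keeps the JDBC URL elided), not the Encoder-based or configuration-based solution I am asking about:

// Wrap the table in a derived query so last_updated_date comes back as CHAR;
// MySQL requires an alias ("t") on the derived table.
val query =
  """(SELECT user_id, item_id, serialized_item, creation_date,
    |        CAST(last_updated_date AS CHAR) AS last_updated_date
    |   FROM table_x) AS t""".stripMargin

val dfWorkaround = sqlContext.read.jdbc("...", query, props)

// Cast the string back to a Spark timestamp; Spark's string-to-timestamp cast
// handles values like "2016-03-30 09:41:03.043".
val fixed = dfWorkaround.withColumn(
  "last_updated_date",
  $"last_updated_date".cast(TimestampType))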