I am loading a DataFrame like so:
df = sparkSession.read.format("jdbc")[...]
and then writing it out as a Parquet file:
df.write.mode(writeMode).parquet(location)
All numeric columns in the DataFrame have type DecimalType(38, 10), but when I try to write one specific table to Parquet, the job fails with the following in the stack trace:
java.lang.IllegalArgumentException: requirement failed: Decimal precision 69 exceeds max precision 38
I am having trouble debugging this. How can I find the row(s) that are causing this exception?
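To make sure I understand what "precision 69" even means, I checked how the JVM counts significant digits with plain java.math.BigDecimal (this is just my own sanity check, not output from the Spark job; the values below are made up):

```java
import java.math.BigDecimal;

public class PrecisionCheck {
    public static void main(String[] args) {
        // DecimalType(38, 10) allows at most 38 significant digits in total,
        // 10 of which sit to the right of the decimal point. So a value with
        // more than 28 integer digits cannot be represented.
        BigDecimal fits   = new BigDecimal("1234567890123456789012345678.0123456789");   // 28 + 10 digits
        BigDecimal tooBig = new BigDecimal("12345678901234567890123456789.0123456789");  // 29 + 10 digits

        System.out.println(fits.precision());   // 38 -> representable
        System.out.println(tooBig.precision()); // 39 -> exceeds max precision 38
    }
}
```

So, if my understanding is right, the failing table must contain at least one value with 69 significant digits, i.e. far more integer digits than DecimalType(38, 10) can hold, and I need a way to isolate those rows.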