When I try to run Spark SQL against Hive, the error below is thrown:
Exception in thread "main" java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
at org.apache.spark.sql.hive.HiveUtils$.formatTimeVarsForHiveClient(HiveUtils.scala:204)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:90)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
As per the SO threads hive-stats-jdbc-timeout-for-hive-queries-in-spark and spark-on-hive-sql-query-error-nosuchfielderror-hive-stats-jdbc-timeout, this issue occurs with specific combinations of Spark and Hive versions. In fact, with a recent Spark such as 2.4.3 and a recent Hive such as 3.1.1, it cannot be avoided.
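For reference, the workaround I have been considering (unverified) is to keep the Hive 3.x jars off Spark's own classpath and instead point the isolated metastore client at them via the documented spark.sql.hive.metastore.* options. The jar path below is a placeholder, and I have not checked whether Spark 2.4.3 accepts "3.1.1" as a metastore version:

```java
// Sketch (assumption, not verified): isolate the Hive metastore client
// with the documented spark.sql.hive.metastore.* options rather than
// adding Hive 3.x jars to Spark's classpath. The jar path is a placeholder.
import org.apache.spark.sql.SparkSession;

public class HiveClientIsolation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("hive-client-isolation")
            // Version of the Hive metastore the isolated client should speak.
            .config("spark.sql.hive.metastore.version", "3.1.1")
            // Where Spark loads the client jars from (placeholder path).
            .config("spark.sql.hive.metastore.jars", "/opt/hive-3.1.1/lib/*")
            .enableHiveSupport()
            .getOrCreate();
        spark.sql("SHOW DATABASES").show();
    }
}
```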
There is also a community JIRA with details, https://issues.apache.org/jira/browse/SPARK-13446, but it has had no update since Feb 2019.
So, do you know of any update on this issue? If we want to work around it ourselves at the source level, any clue about how to do that?
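On the source-level question, one pattern I imagine could work (my assumption only, not the actual SPARK-13446 patch) is to resolve the constant reflectively in HiveUtils.formatTimeVarsForHiveClient, so the class still links against Hive 3.x, where HIVE_STATS_JDBC_TIMEOUT was removed. The ConfVars enum below is a stand-in for Hive's real org.apache.hadoop.hive.conf.HiveConf.ConfVars, just to keep the sketch self-contained:

```java
// Sketch (assumption): look up a possibly-removed enum constant by name
// instead of referencing it statically, so a missing constant degrades to
// "skip this variable" rather than NoSuchFieldError at class-load time.
public class HiveVarGuard {
    // Stand-in for Hive 3.x's ConfVars, which no longer defines
    // HIVE_STATS_JDBC_TIMEOUT.
    enum ConfVars { HIVE_EXEC_SCRATCHDIR, METASTORE_CONNECTION_POOLING_TYPE }

    static java.util.Optional<ConfVars> lookup(String name) {
        try {
            return java.util.Optional.of(ConfVars.valueOf(name));
        } catch (IllegalArgumentException e) {
            // Constant absent in this Hive version: caller skips the var.
            return java.util.Optional.empty();
        }
    }

    public static void main(String[] args) {
        System.out.println(lookup("HIVE_STATS_JDBC_TIMEOUT").isPresent()); // prints false
        System.out.println(lookup("HIVE_EXEC_SCRATCHDIR").isPresent());    // prints true
    }
}
```

Would a guard like this (applied to the real ConfVars) be a reasonable way to skip the field, or does the bundled Hive 1.2.1 client break elsewhere anyway?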
Thanks in advance for your help.