
I am using the code linked below to test Spark before running it on my own data:

https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/ml/JavaFPGrowthExample.java

These are my Hadoop and Spark dependencies:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.12</artifactId>
    <version>2.4.0</version>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.2.0</version>
</dependency>
```

I get this exception:

```
Exception in thread "main" java.lang.ExceptionInInitializerError
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
	at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3038)
	at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3036)
	at org.apache.spark.ml.util.Instrumentation.logDataset(Instrumentation.scala:60)
	at org.apache.spark.ml.fpm.FPGrowth.$anonfun$genericFit$1(FPGrowth.scala:169)
	at org.apache.spark.ml.util.Instrumentation$.$anonfun$instrumented$1(Instrumentation.scala:183)
	at scala.util.Try$.apply(Try.scala:209)
	at org.apache.spark.ml.util.Instrumentation$.instrumented(Instrumentation.scala:183)
	at org.apache.spark.ml.fpm.FPGrowth.genericFit(FPGrowth.scala:165)
	at org.apache.spark.ml.fpm.FPGrowth.fit(FPGrowth.scala:162)
	at Spark.main(Spark.java:45)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.5
	at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:64)
	at com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:51)
	at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
	at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:751)
	at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
	at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
	... 14 more
```

Is this a known problem, and how can I fix it?

Note:

I saw the same problem reported for Scala in the question "Spark2.1.0 incompatible Jackson versions 2.7.6", but I couldn't apply the same fix in Java.
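For reference, this is the kind of Maven override I tried, adapted from the Scala (sbt) answer. The exact artifact names and the 2.6.7.x version are my assumptions about what Spark 2.4.0 bundles:

```xml
<!-- Attempted fix (assumption): pin Jackson to the 2.6.7.x line that I believe
     Spark 2.4.0 bundles, so that transitive dependencies such as
     hadoop-common 3.2.0 cannot pull in the incompatible 2.9.5 -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.6.7.1</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-scala_2.12</artifactId>
            <version>2.6.7.1</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```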
