
I want to access Spark from the Play Framework using Scala.

I am following this example: Play with Apache Spark.

I adjusted the libraries in the build.sbt file to their current versions, and the project compiled.

I ran the example using sbt run, and when I tried to receive the value from Spark, the following exception occurred:

[RuntimeException: java.lang.ExceptionInInitializerError]

The tutorial shows how to access Spark from the Play Framework using Scala.

Why am I receiving this error?

I am using Scala 2.12.9 and Apache Spark 2.4.3.

build.sbt

    name := """play-with-spark"""
    organization := "ch.martin"

    version := "1.0-SNAPSHOT"

    lazy val root = (project in file(".")).enablePlugins(PlayScala)

    scalaVersion := "2.12.9"

    libraryDependencies ++= Seq(
      guice,
      "org.joda" % "joda-convert" % "2.2.1",
      "org.scalatestplus.play" %% "scalatestplus-play" % "4.0.3" % Test,
      "org.apache.spark" % "spark-core_2.12" % "2.4.3",
      "org.apache.spark" % "spark-sql_2.12" % "2.4.3",
      "org.apache.spark" % "spark-mllib_2.12" % "2.4.3",
      "org.apache.hadoop" % "hadoop-client" % "3.1.2"

    )

    dependencyOverrides ++= Seq(
      "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7",
      "com.google.guava" % "guava" % "28.1-jre"

    )

SparkTest.scala


    import org.apache.spark.sql.SparkSession

    def Example: Int = {

        println("TEST: Start running spark")

        // start (or reuse) a local Spark session with 4 worker threads
        val sparkS = SparkSession.builder().master("local[4]").getOrCreate()

        println("TEST: Spark session successfully set up")

        import sparkS.implicits._

        // here the exception occurs
        val sum = Seq(3, 2, 4, 1, 0, 30, 30, 40, 50, -4).toDS

        sum.count.toInt
      }
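As an aside, the value the endpoint should return can be sanity-checked without Spark: plain Scala collections give the element sum directly. Note that `count` on this Dataset would return 10 (the number of elements), while the sum of the numbers is 156, so returning 156 would require something like `reduce(_ + _)` rather than `count`:

```scala
// Sanity check of the expected result using plain Scala collections (no Spark).
object SumCheck extends App {
  val xs = Seq(3, 2, 4, 1, 0, 30, 30, 40, 50, -4)

  // Dataset.count would return the number of elements...
  println(xs.length) // 10

  // ...while the sum of the numbers is what the question expects.
  println(xs.sum)    // 156
}
```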

Calling Spark in HomeController.scala


    def test = Action { implicit request =>
        val sum = SparkTest.Example
        Ok(views.html.test_args(s"A call to spark, with result: $sum"))
      }

The stack trace of the exception is:

play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[RuntimeException: java.lang.ExceptionInInitializerError]]
        at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:351)
        at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:267)
        at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:448)
        at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:446)
        at scala.concurrent.Future.$anonfun$recoverWith$1(Future.scala:417)
        at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
        at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
        at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:92)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
Caused by: java.lang.RuntimeException: java.lang.ExceptionInInitializerError
        at play.api.mvc.ActionBuilder$$anon$10.apply(Action.scala:431)
        at play.api.mvc.Action.$anonfun$apply$2(Action.scala:98)
        at play.api.libs.streams.StrictAccumulator.$anonfun$mapFuture$4(Accumulator.scala:184)
        at scala.util.Try$.apply(Try.scala:213)
        at play.api.libs.streams.StrictAccumulator.$anonfun$mapFuture$3(Accumulator.scala:184)
        at scala.Function1.$anonfun$andThen$1(Function1.scala:57)
        at scala.Function1.$anonfun$andThen$1(Function1.scala:57)
        at scala.Function1.$anonfun$andThen$1(Function1.scala:57)
        at play.api.libs.streams.StrictAccumulator.run(Accumulator.scala:219)
        at play.core.server.AkkaHttpServer.$anonfun$runAction$4(AkkaHttpServer.scala:441)
Caused by: java.lang.ExceptionInInitializerError: null
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
        at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
        at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:296)
        at org.apache.spark.sql.Dataset.$anonfun$count$1(Dataset.scala:2830)
        at org.apache.spark.sql.Dataset.$anonfun$count$1$adapted(Dataset.scala:2829)
        at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:3364)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.8.7
        at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:64)
        at com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:51)
        at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
        at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:745)
        at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
        at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
        at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
        at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:296)

I expected a response with the sum of the numbers: 156.

  • Looking at your stacktrace and googling, I saw this: [Spark2.1.0 incompatible Jackson versions 2.7.6](https://stackoverflow.com/questions/43841091/spark2-1-0-incompatible-jackson-versions-2-7-6). – George Leung Sep 10 '19 at 08:26
  • It works with the solution here: https://stackoverflow.com/questions/43841091/spark2-1-0-incompatible-jackson-versions-2-7-6. Thank you. – eisem Sep 10 '19 at 09:45
  • Possible duplicate of [Spark2.1.0 incompatible Jackson versions 2.7.6](https://stackoverflow.com/questions/43841091/spark2-1-0-incompatible-jackson-versions-2-7-6) – Jeffrey Chung Sep 10 '19 at 10:25
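Following the linked answers, the usual remedy is to keep every Jackson artifact on one version that both Play and Spark's bundled jackson-module-scala accept, instead of pinning jackson-databind alone to 2.8.7. A build.sbt sketch of that approach is below; the version 2.9.9 is an assumption and may need adjusting to your actual dependency tree:

```scala
// build.sbt fragment: force all Jackson artifacts to one consistent version.
// 2.9.9 is an assumed value; pick whichever version your dependency graph agrees on.
val jacksonVersion = "2.9.9"

dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core"   %  "jackson-core"         % jacksonVersion,
  "com.fasterxml.jackson.core"   %  "jackson-databind"     % jacksonVersion,
  "com.fasterxml.jackson.core"   %  "jackson-annotations"  % jacksonVersion,
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % jacksonVersion
)
```

The "Incompatible Jackson version" check is performed by jackson-module-scala when it registers against an ObjectMapper, which is why mixing a 2.8.x databind with a different module version fails at Spark's first RDD operation.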
