
I installed Spark 2.4.3 and sbt 1.2.8, on Windows 10 Pro.

java -version gives:

java version "1.8.0_211"
Java(TM) SE Runtime Environment (build 1.8.0_211-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.211-b12, mixed mode)

I am trying to run the Quick Start from spark.apache.org.

Everything runs fine until I use sbt, at which point I get the following exception:

java.lang.ClassNotFoundException: scala.runtime.LambdaDeserialize

I have read a lot about the importance of matching the Spark version with the right Scala version.

My build.sbt is:

name := "simple-app"
version := "1.0"
scalaVersion := "2.12.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3"    

My code is:

import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]): Unit = {
    print("Hello !\r\n")
    val logFile = "D:/bin/spark/README.md" // Should be some file on your system
    val spark = SparkSession.builder.appName("Simple Application").getOrCreate()
    val logData = spark.read.textFile(logFile).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    spark.stop()
  }
}

If I comment out

    //val numAs = logData.filter(line => line.contains("a")).count()
    //val numBs = logData.filter(line => line.contains("b")).count()
    //println(s"Lines with a: $numAs, Lines with b: $numBs")

I do see Hello ! in my console.

If I run each line of main in spark-shell, there is no error at all.

The corresponding Python script is submitted successfully.

What am I missing?


1 Answer


In the end it was indeed a Spark/Scala version compatibility problem.

On https://spark.apache.org/downloads.html one can read:

Note that, Spark is pre-built with Scala 2.11 except version 2.4.2, which is pre-built with Scala 2.12

I misread that, understanding it as "from version 2.4.2 of Spark onwards". Besides, the Quick Start says:

scalaVersion := "2.12.8"

Finally I checked the jars directory of my Spark installation and found:

scala-compiler-2.11.12.jar
scala-library-2.11.12.jar
...
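
Another way to check, without browsing the jars, is to ask the REPL itself. This is only a sketch of what one might run inside spark-shell; on a distribution pre-built with Scala 2.11 it should report 2.11.12:

// Inside spark-shell: print the Scala version the shell (and thus Spark) runs on
scala.util.Properties.versionString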

So I changed my build.sbt to:

scalaVersion := "2.11.12"

which fixed the problem (this one, at least).
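
For reference, the complete build.sbt after the change would look like this (same name, version and dependency as in the question, only the Scala version differs):

name := "simple-app"
version := "1.0"
// must match the Scala version the pre-built Spark 2.4.3 distribution ships with
scalaVersion := "2.11.12"
// %% now resolves to spark-sql_2.11, matching the jars in the Spark installation
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3"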
