I installed Spark 2.4.3 and sbt 1.2.8 on Windows 10 Pro.
java -version
gives:
java version "1.8.0_211"
Java(TM) SE Runtime Environment (build 1.8.0_211-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.211-b12, mixed mode)
I am trying to follow the Quick Start from spark.apache.org. Everything runs fine until I build the application with sbt and submit it; I then get the following exception:
java.lang.ClassNotFoundException: scala.runtime.LambdaDeserialize
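For completeness, this is how I build and submit, following the Quick Start (the jar path is my assumption: it is what sbt should produce from the build.sbt below):

sbt package
spark-submit --class "SimpleApp" --master local[4] target/scala-2.12/simple-app_2.12-1.0.jar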
I have read a lot about the importance of matching the Spark version with the right Scala version. My build.sbt is:
name := "simple-app"
version := "1.0"
scalaVersion := "2.12.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3"
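As far as I can tell, the pre-built Spark 2.4.3 download is compiled against Scala 2.11 (only 2.4.2 shipped pre-built for 2.12), so one variant I have been wondering about is pinning the build to the matching Scala 2.11 line instead. A sketch, with 2.11.12 being my guess at the right patch release:

name := "simple-app"
version := "1.0"
// match the Scala line the pre-built Spark 2.4.3 distribution ships with
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3"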
My code is:
import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]) {
    print("Hello !\r\n")
    val logFile = "D:/bin/spark/README.md" // Should be some file on your system
    val spark = SparkSession.builder.appName("Simple Application").getOrCreate()
    val logData = spark.read.textFile(logFile).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    spark.stop()
  }
}
If I comment out

//val numAs = logData.filter(line => line.contains("a")).count()
//val numBs = logData.filter(line => line.contains("b")).count()
//println(s"Lines with a: $numAs, Lines with b: $numBs")

then I do see Hello ! in my console, so the failure only seems to occur once the count() actions force the filter lambdas to actually run.
If I run each line of main one by one in spark-shell, there is no error at all.
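In case it matters, the Scala version a Spark distribution was built with is printed in the spark-shell startup banner, and it can also be checked from inside the shell, something like:

scala> scala.util.Properties.versionString
// returns e.g. "version 2.12.8" or "version 2.11.12", depending on
// the Scala version the distribution was built with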
The corresponding Python script is successfully submitted.
What am I missing?