Which Scala version works with Spark 2.2.0? I'm getting the following error:

    Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
Spark 2.2.0 is built and distributed to work with Scala 2.11 by default. To write applications in Scala, you will need to use a compatible Scala version (e.g. 2.11.x). Your Scala version is probably 2.12.x; that is why it is throwing this exception.
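As a minimal sketch of how one might pin the versions in an sbt-based project (the project layout is assumed; only these two settings matter here):

    // build.sbt -- minimal version pinning (sketch)
    // Use the Scala binary version that Spark 2.2.0 was built against.
    scalaVersion := "2.11.8"

    // %% appends the Scala binary suffix to the artifact name,
    // so this resolves to spark-core_2.11.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"

If the suffix implied by `%%` and the cluster's Scala version disagree (e.g. `_2.12` against a 2.11 runtime), a `NoSuchMethodError` like the one above is a typical symptom.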
To find the appropriate Scala version for your Spark application, one could run spark-shell on the target server. The desired Scala version is shown in the welcome message:
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/ '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.2.0.2.6.3.0-235
          /_/

    Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_152)
It is 2.11.8 in my Spark distribution.
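If the banner has already scrolled out of view, the same information can be read from inside spark-shell via the standard library (a small sketch; the exact output depends on your distribution):

    scala> util.Properties.versionString
    res0: String = version 2.11.8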
There are also pages on the MVN repository that list the Scala version for one's Spark distribution:
https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.12
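To confirm which Scala library actually ends up on the application's classpath, one could run a small check like the following (a sketch using only the standard library; the object name is a placeholder):

    // Prints the version of the Scala library found on the classpath,
    // e.g. "version 2.11.8". A mismatch between this and the _2.xx suffix
    // of your Spark artifacts is a likely cause of NoSuchMethodError.
    object ScalaVersionCheck {
      def main(args: Array[String]): Unit = {
        println(scala.util.Properties.versionString)
      }
    }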
Spark 2.2.0 needs Java 8+ and Scala 2.11. That's it for the version info.

But looking at your error, `Exception in thread "main" java.lang.NoSuchMethodError`, it seems your Spark is unable to find the driver class. You should probably be looking in this direction rather than at versions.