
I have written a simple Spark app in Scala using IDEA, and I get this error message when I run it:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1406)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:76)
at com.chandler.hellow_world_b6$.main(hellow_world_b6.scala:13)
at com.chandler.hellow_world_b6.main(hellow_world_b6.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)

Process finished with exit code 1.

The code is:

import org.apache.spark.{SparkContext,SparkConf}
object hellow_world_b6{
    def main(args: Array[String]): Unit = {
        println( "Hello World   12!")
        val conf=new SparkConf()
        val sc=new SparkContext(conf)
    }
}

The Maven configuration is:

<properties>
    <scala.version>2.12.1</scala.version>
</properties>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
</dependency>
chandler
    Possible duplicate of [java.lang.NoSuchMethodError: scala.Predef$.refArrayOps](http://stackoverflow.com/questions/40328948/java-lang-nosuchmethoderror-scala-predef-refarrayops) – Tzach Zohar Feb 01 '17 at 16:22
  • You're using two different Scala versions - `spark-core_2.11` uses 2.11 and you're importing Scala `2.12.1` - align the versions (use Spark's version or build spark deps with 2.12) – Tzach Zohar Feb 01 '17 at 16:24
  • So how can I fix it? Install Scala 2.11, or just change 2.12.1 to 2.11? I have already changed the Scala version to 2.11, but that did not fix the issue. BTW, I am new to all of this: Java, Scala, Spark *-* – chandler Feb 01 '17 at 16:52
  • changing `` to any minor 2.11 version (e.g. `2.11.8`) should be enough – Tzach Zohar Feb 01 '17 at 16:53
  • Thanks a lot @Tzach Zohar. The issue was fixed when I changed the Scala version to 2.11.8. So can I say that the number at the end of spark-core, like spark-core_x.xx, also indicates the Scala version I must use? – chandler Feb 02 '17 at 03:53
  • 1
    Yes, that's exactly what it means – Tzach Zohar Feb 02 '17 at 06:23
  • https://stackoverflow.com/questions/75947449/run-a-scala-code-jar-appear-nosuchmethoderrorscala-predef-refarrayops – Dmytro Mitin Apr 07 '23 at 05:05
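
As the comments conclude, the `_2.11` suffix in `spark-core_2.11` names the Scala binary version the artifact was compiled against, so `scala-library` must be a matching 2.11.x release. A minimal sketch of the aligned POM fragment, keeping the question's artifacts and the 2.11.8 version the asker reported as working (the surrounding `<dependencies>` wrapper is assumed, since the question only shows a fragment):

```xml
<properties>
    <!-- must match the _2.11 suffix on the Spark artifact below -->
    <scala.version>2.11.8</scala.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <!-- _2.11 = Scala binary version this Spark build targets -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
</dependencies>
```

Mixing Scala binary versions (e.g. a 2.12 `scala-library` with a `_2.11` Spark artifact) typically surfaces at runtime as `NoSuchMethodError` on `scala.Predef` or collection methods, exactly as in the stack trace above.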

0 Answers