
The problem I am having is that I don't seem to be able to create a SparkContext, and I have no idea why not.

Here is my code:

import org.apache.spark.{SparkConf, SparkContext}

object spark_test {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Datasets Test").setMaster("local")
    val sc = new SparkContext(conf)
    println(sc)
  }
}

And here is the result that I am getting:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.SparkConf.getAkkaConf(SparkConf.scala:203)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:68)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
at spark_test$.main(test.scala:6)
at spark_test.main(test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)

Any thoughts?

Ryan
    Which Spark version and which Scala version are you using? Add your SBT/Maven dependencies – Yuval Itzchakov Dec 22 '16 at 21:19
  • Looks like a version mismatch – evan.oman Dec 22 '16 at 21:27
  • Scala 2.12.1 org.apache.spark:spark-core_2.10:0.9.2 org.apache.spark:spark-graphx_2.10:0.9.2 org.apache.spark:spark-mllib_2.10:0.9.2 org.apache.spark:spark-sql_2.10:1.1.1 org.apache.spark:spark-streaming_2.10:0.9.2 – Ryan Dec 22 '16 at 21:27
    your version of scala and version of spark need to match, with that spark core you need to run scala 2.10 – Angelo Genovese Dec 22 '16 at 21:32
  • Spark 0.9.2? That's quite old and very unstable. Looks like you're getting started with Spark. Any reasons not to use the latest version? – maasg Dec 22 '16 at 22:45
  • https://stackoverflow.com/questions/75947449/run-a-scala-code-jar-appear-nosuchmethoderrorscala-predef-refarrayops – Dmytro Mitin Apr 07 '23 at 05:00

1 Answer


Your Scala version is too new and your spark-core version is too old. I am using Scala 2.11.8 with spark-core_2.11:2.0.1 – you can try that!
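In case it helps, here is a minimal build.sbt sketch for a matched pair of versions. The version numbers mirror the ones mentioned above and are illustrative, not the only valid combination; adapt the coordinates if you use Maven instead of sbt:

// build.sbt -- minimal sketch, assuming sbt and the versions suggested above
scalaVersion := "2.11.8"

val sparkVersion = "2.0.1"

libraryDependencies ++= Seq(
  // %% appends the Scala binary version (_2.11) automatically,
  // which keeps the Spark artifacts in sync with scalaVersion
  "org.apache.spark" %% "spark-core"      % sparkVersion,
  "org.apache.spark" %% "spark-sql"       % sparkVersion,
  "org.apache.spark" %% "spark-mllib"     % sparkVersion,
  "org.apache.spark" %% "spark-graphx"    % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion
)

Using %% (rather than hard-coding _2.10 or _2.11 in the artifact name) is what prevents the kind of Scala/Spark mismatch that produces the NoSuchMethodError on scala.Predef$.refArrayOps.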

Hatter Bush