
I'm new to Scala and Spark. I've been frustrated by how hard it has been to get things to work with IntelliJ. Currently, I can't run the code below. I'm sure it's something simple, but I can't get it to work.

I'm trying to run:

import org.apache.spark.{SparkConf, SparkContext}

object TestScala {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setAppName("Datasets Test")
    conf.setMaster("local[2]")
    val sc = new SparkContext(conf)
    println(sc)
  }
}

The error I get is:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1413)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:77)
at TestScala$.main(TestScala.scala:13)
at TestScala.main(TestScala.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)

My build.sbt file:

name := "sparkBook"

version := "1.0"

scalaVersion := "2.12.1"
– lars
  • I am here to help, don't get frustrated. Can you show me your `build.sbt`? – Nagarjuna Pamu Dec 12 '16 at 21:13
  • How are you running it? What kind of project have you set it up as? What does your `build.sbt` look like? Literally, it didn't take more than a minute for me to create a new project, copy your code, change the print statement to `println("AppName: " + sc.appName)`, run it, and see the expected output, which is `AppName: Datasets Test`. – The_Tourist Dec 12 '16 at 21:14
  • @YoungSpice I figured out what I was doing wrong. Fixed that; now I get a different error. Changed my post. I'm running it as a Scala script. – lars Dec 12 '16 at 21:15
  • @pamu I get a different error now; it doesn't recognize a method. I've added my sbt build file. – lars Dec 12 '16 at 21:17
  • @lars your `build.sbt` does not contain the Spark dependencies. Why? – Nagarjuna Pamu Dec 12 '16 at 21:21
  • @lars the right way is: 1) open IntelliJ, 2) create a new sbt project, 3) go to `build.sbt` and add Apache Spark as a dependency, 4) write a main method with your code, 5) run using `sbt run` – Nagarjuna Pamu Dec 12 '16 at 21:23
  • I'm following this guide: https://www.ibm.com/developerworks/community/files/basic/anonymous/api/library/e5c0146d-f723-446b-9151-c31d4c56ed01/document/b41505ac-141b-45a2-84cd-1b6a8d5ae653/media/Setting%20up%20spark%202.0%20with%20intellij%20community%20edition.pdf – lars Dec 12 '16 at 21:24
  • https://stackoverflow.com/questions/75947449/run-a-scala-code-jar-appear-nosuchmethoderrorscala-predef-refarrayops – Dmytro Mitin Apr 07 '23 at 04:55

3 Answers


Change your scalaVersion to 2.11.8 and add the Spark dependency to your build.sbt:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"
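
Putting this together with the rest of your file, a minimal `build.sbt` under these assumptions (Spark 2.0.2, which is published for Scala 2.11) would look like:

name := "sparkBook"

version := "1.0"

// Spark 2.0.2 artifacts are built against Scala 2.11, so the Scala version must match
scalaVersion := "2.11.8"

// Spark core, compiled for Scala 2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"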

– The_Tourist
  • It's a bit more idiomatic to use `libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"` isn't it? – Angelo Genovese Dec 12 '16 at 21:25
  • @AngeloGenovese You're right, it is more idiomatic but I was a bit hesitant to suggest that because if the Scala version is changed to 2.12, the build would fail if Spark hasn't been published for that version. At least from what I've seen, this situation hasn't been easy to debug if the developer isn't aware of the `%%` vs `%` semantics. – The_Tourist Dec 12 '16 at 22:02
  • Yeah, maybe, but then I ran into an even bigger problem with this weird error: https://github.com/sbt/zinc/issues/276, where the "compiler-bridge_2.11" package could not be compiled. Scala 2.11.0, 2.11.8, and 2.11.11 did not work, but 2.11.12 worked! – seb Aug 27 '19 at 15:45
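
To make the `%%` discussion above concrete: with `scalaVersion := "2.11.8"`, these two declarations resolve to the same artifact, because `%%` appends the Scala binary version (`_2.11`) to the artifact name:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"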

One more scenario: IntelliJ is pointing to Scala 2.12.4 while all the Maven/sbt dependencies are 2.11.8, i.e. built against Scala 2.11.

I stepped back from 2.12.4 to 2.11.8 under Global Libraries in the IntelliJ UI, and it started working.

Details:

The Maven pom.xml points to Scala 2.11.8, but in my IntelliJ the Scala SDK under Global Libraries is 2.12.4 (shown below), which is causing:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

[screenshot: Global Libraries showing the Scala SDK at 2.12.4]

I stepped back to 2.11.8 in Global Libraries, like below:

[screenshot: Global Libraries showing the Scala SDK at 2.11.8]

That's it. Problem solved. No more errors when running that program.

Conclusion: The Maven dependencies alone will not solve the problem; along with that, we have to configure the Scala SDK under Global Libraries, since the error comes up while running a Spark program locally and is related to the IntelliJ runtime.
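
A quick sanity check for this kind of mismatch (my own diagnostic sketch, not part of the original answer): print the Scala version the program actually runs with. If it reports 2.12.x while your Spark artifacts end in `_2.11`, you have exactly this problem.

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Prints the Scala version on the runtime classpath, e.g. "version 2.11.8".
    // This should agree with the _2.11 suffix of the Spark dependencies.
    println(scala.util.Properties.versionString)
  }
}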

– Ram Ghadiyaram

If you use Spark 2.4.3, you need to use Scala 2.11, even though the Spark website says to use Scala 2.12: https://spark.apache.org/docs/latest/

This avoids `NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;`.
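
A minimal sketch of the matching `build.sbt` lines under this assumption (Spark 2.4.3 prebuilt against Scala 2.11):

// Scala 2.11.x, to match the prebuilt Spark 2.4.3 artifacts
scalaVersion := "2.11.12"

// %% resolves this to spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"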

– Miae Kim