62

I have the following class:

import scala.util.{Success, Failure, Try}


class MyClass {

  def openFile(fileName: String): Try[String] = {
    Failure(new Exception("Invalid file name"))
  }

  def main(args: Array[String]): Unit = {
    openFile(args.head)
  }

}

Which has the following unit test:

class MyClassTest extends org.scalatest.FunSuite {

  test("pass inexistent file name") {
    val myClass = new MyClass()
    assert(myClass.openFile("./noFile").failed.get.getMessage == "Invalid file name")
  }

}

When I run sbt test I get the following error:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at org.scalatest.tools.FriendlyParamsTranslator$.translateArguments(FriendlyParamsTranslator.scala:174)
        at org.scalatest.tools.Framework.runner(Framework.scala:918)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:533)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:527)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.immutable.Map$Map1.foreach(Map.scala:109)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.AbstractTraversable.map(Traversable.scala:105)
        at sbt.Defaults$.createTestRunners(Defaults.scala:527)
        at sbt.Defaults$.allTestGroupsTask(Defaults.scala:543)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:35)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:34)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
[error] (test:executeTests) java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

Build definitions:

version := "1.0"

scalaVersion := "2.12.0"

// https://mvnrepository.com/artifact/org.scalatest/scalatest_2.11
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "3.0.0"

I can't figure out what causes this. My class and unit test seem simple enough. Any ideas?

  • Can you share your build definition as well? – stefanobaghino Oct 30 '16 at 13:24
  • I confirmed your class methods work as expected in a standard scala repl. Must be an issue with the sbt build def. – Brian Pendleton Oct 30 '16 at 13:30
  • This specific error happens when you use Scala 2.11 JAR files in Scala 2.12 projects. Scalatest is cross compiled with Scala 2.11 and Scala 2.12, so you can avoid this error by leveraging the SBT `%%` operator, as indicated in the accepted answer. See my answer to learn more about the SBT `%%` operator and cross compilation, topics all Scala programmers must understand to avoid headaches. – Powers Dec 02 '20 at 14:26
  • For those using spark, it also matters what scala is in the runtime where you submit. And for those using AWS EMR specifically, they use 2.11 (at least for EMR 5.x.x) even though 2.12 is also compatible with spark 2.4.x. – combinatorist Jan 22 '21 at 21:18
  • https://stackoverflow.com/questions/75947449/run-a-scala-code-jar-appear-nosuchmethoderrorscala-predef-refarrayops – Dmytro Mitin Apr 07 '23 at 04:40

14 Answers

70

I had an SDK with a different version of Scala in Global Libraries (IntelliJ IDEA).
File -> Project Structure -> Global Libraries -> Remove SDK -> Rebuild. It fixed the exception for me.

Anton Tkachov
  • Another thing that helped in addition to this: reimporting the maven module (if you're using that). Also check Project Structure -> Problems, which may indicate a reference to an invalid/outdated scala SDK library. This happened to me after extended troubleshooting and trying different scala versions. – Patrick Jun 17 '19 at 18:10
  • Worked Like Charm. also add scala sdk again in global dependency – vipin chourasia Apr 14 '21 at 09:41
  • Does it mean we are not allowed to have a few SDKs in Global Libraries? The use case is I have different projects with different scala versions: 2.11 / 2.12. What is the proper way to handle such cases? – iamtodor Apr 05 '22 at 17:56
  • UPD: https://stackoverflow.com/a/44035141/5151861 – iamtodor Apr 05 '22 at 18:23
42

scalatest_2.11 is the version of ScalaTest compatible only with Scala 2.11.x. Write libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test" (note the %%) instead to pick the correct version automatically, and switch to Scala 2.11.8 until scalatest_2.12 is released (it should be very soon). See http://www.scala-sbt.org/0.13/docs/Cross-Build.html for more.
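
For reference, applying that to the build definition in the question gives (a minimal sketch; Scala 2.11.8 per the answer above, since scalatest_2.12 wasn't out yet):

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"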

Alexey Romanov
15

I used IntelliJ and just imported the project again: close the open project and re-import it as a Maven or SBT project. Note: I selected the Maven option "Import Maven projects automatically". The error disappeared.

HHH
  • or check your scala version, it should be the same as your pom.xml or sbt setting – HHH May 17 '17 at 21:56
8

This error occurs when you use a Scala JAR file that was compiled with Scala 2.11 for a Scala 2.12 project.

Scala libraries are generally cross compiled with different versions of Scala, so different JAR files are published to Maven for different Scala versions. For example, Scalatest version 3.2.3 publishes separate JAR files to Maven for Scala 2.10, 2.11, 2.12, and 2.13, as you can see here.
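
To make that concrete: %% simply appends your project's Scala binary version to the artifact name, so for a 2.12 project these two declarations (using the ScalaTest coordinates above) resolve to the same JAR:

libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.3"
libraryDependencies += "org.scalatest" % "scalatest_2.12" % "3.2.3"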

Lots of Spark programmers will run into this error when they attach a JAR file that was compiled with Scala 2.11 to a cluster that's running Scala 2.12. See here for a detailed guide on how to migrate Spark projects from Scala 2.11 to Scala 2.12.

As the accepted answer mentions, the SBT %% operator should be used when specifying Scala dependencies, so that you automatically grab the library JAR that corresponds to your project's Scala version. The %% operator won't help you if the library doesn't publish a JAR file for the Scala version you're looking for. Look at the Spark releases for example:

(screenshot: Spark releases and the Scala versions they support)

This build.sbt file will work because there is a Scala 2.12 release for Spark 3.0.1:

scalaVersion := "2.12.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

This code will not work because there isn't a Scala 2.11 release for Spark 3.0.1:

scalaVersion := "2.12.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

You can cross compile your project and build JAR files for different Scala versions if your library dependencies are also cross compiled. Spark 2.4.7 is cross compiled with Scala 2.11 and Scala 2.12, so you can cross compile your project with this code:

scalaVersion := "2.11.12"
crossScalaVersions := Seq("2.11.12", "2.12.10")
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.7"

Running sbt +assembly will build two JAR files for your project, one compiled with Scala 2.11 and another compiled with Scala 2.12. Libraries that release multiple JAR files follow a similar cross-compilation workflow.
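
For illustration, with the standard sbt layout and sbt-assembly's default output paths, the cross build produces something like the following (the project name myproject is hypothetical):

$ sbt +assembly
# produces target/scala-2.11/myproject-assembly-1.0.jar
# and      target/scala-2.12/myproject-assembly-1.0.jar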

Powers
3

In my experience, if you still get errors after matching the ScalaTest and Scala versions in build.sbt, you have to think about the Scala version actually installed on your machine. You can check it by running $ scala and looking at the welcome message:

Welcome to Scala 2.12.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121). Type in expressions for evaluation. Or try :help.

You need to match this Scala version (e.g. 2.12.1 here) with the one in build.sbt.
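
You can also check it non-interactively; the exact wording varies by release, but it looks something like:

$ scala -version
Scala code runner version 2.12.1 -- Copyright 2002-2016, LAMP/EPFL and Lightbend, Inc.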

MyounghoonKim
3

In my case, the Spark version made it incompatible. Changing to Spark 2.4.0 worked for me.

louis l
  • AWS EMR 5.x.x clusters always use scala 2.11 for spark submit jobs even though 2.12 scala should be a compatible option with spark 2.4.x. – combinatorist Jan 22 '21 at 21:17
2

This was happening to me in Databricks. The problem was the same as noted in the previous answers: an incompatibility between the Spark and Scala versions. For Databricks, I had to change the cluster's Databricks Runtime Version. The default was Scala 2.11/Spark 2.4.5; bump this up to at least Scala 2.12/Spark 3.0.0.

Click Clusters > Cluster_Name > Edit > DataBricks Runtime Version


ibaralf
1

When you use Spark, Hadoop, Scala, and Java together, incompatibilities can arise; you need a version of each that is compatible with the others. I use Spark 2.4.1, Hadoop 2.7, Java 9.0.1, and Scala 2.11.12, and they are compatible with each other.
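
A quick way to see which versions are actually in play at runtime is a check like this (a sketch; it assumes a SparkSession named spark is in scope, as in spark-shell):

// Print the Scala, Spark, and Java versions the running process uses
println(scala.util.Properties.versionString)  // e.g. "version 2.11.12"
println(spark.version)                        // e.g. "2.4.1"
println(System.getProperty("java.version"))   // e.g. "9.0.1"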

MeirDayan
  • Important to note that starting in Spark version 2.4.2 the default distribution is compiled using Scala 2.12; prior to that 2.11 is used by default. So if you hit this error and you are using 2.11 dependencies in your project, make sure your Spark installation is also built using 2.11 – JMess May 09 '19 at 22:56
0

Try adding the following line to your build.sbt

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"

Your build.sbt should then look like this:

libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"

With this, the error was solved for me.

Yousef Irman
0

In the Eclipse IDE, the project tends to be preselected with the Scala installation 'Latest 2.12 bundle (dynamic)' configuration. If you are not actually using 2.12 for your Scala project and you attempt to run it through the IDE, this issue will manifest itself.

I've also noticed that if I rebuild my Eclipse project with the sbt command "eclipse with-source", this has the side effect of resetting the Eclipse project's Scala installation back to the 2.12 setting (even though my build.sbt file is configured for a 2.11 version of Scala). So be on the lookout for both of those scenarios.

Andrew Norman
0

In my case, I had a project JAR dependency that depended on a different version of Scala. This was found under the Project Structure -> Modules -> (selected project) -> Dependencies tab. Everything else in the project and its libraries lined up on the same Scala version (2.12), but the other JAR was hiding a transitive dependency on an older version (2.11).
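
If you suspect a hidden transitive dependency like this, printing the dependency tree can surface it. A sketch (dependencyTree is built into sbt 1.4+; older sbt needs the sbt-dependency-graph plugin, and Maven users have mvn dependency:tree):

$ sbt dependencyTree | grep _2.11
$ mvn dependency:tree | grep _2.11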

voxoid
0

I am doing a PoC using Apache Spark 3.1.1 and Apache Ignite 2.10, trying to load data from Spark into an Ignite cluster. But while saving the data I get the error below.

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

My code is as below:

df.write
  .format(FORMAT_IGNITE)
  .option(OPTION_CONFIG_FILE, CONFIG)
  .option(OPTION_TABLE, "connect")
  .option(OPTION_CREATE_TABLE_PRIMARY_KEY_FIELDS, "id")
  .option(OPTION_CREATE_TABLE_PARAMETERS, "template=replicated")
  .mode(SaveMode.Append)
  .save()

C. Peck
0

This happens if you use Scala 2.12 libraries with Scala 2.11, or vice versa.

If cleaning up the pom.xml with the right packages didn't work, it is because the JARs are already downloaded to your local repository.

Delete the ~/.m2/ folder and reload the packages.
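
A sketch of that reset (the cache itself lives in ~/.m2/repository; deleting just that subfolder keeps your settings.xml, and the next build re-downloads everything):

$ rm -rf ~/.m2/repository
$ mvn clean package   # re-resolves and re-downloads all dependencies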

Kiran Gali
0

For anyone encountering this issue with Scala in AWS Glue 3:

Glue 3 uses Spark 3 and Scala 2.12, and as the other answers indicate, you can't use Scala 2.11 JARs on a cluster running Scala 2.12.

So if you were like me and were trying to use an extra JAR file compiled with Scala 2.11 (one which worked in earlier versions of Glue), you will now get this error and will need to rebuild/change your JAR to one built with Scala 2.12 for use with Glue 3.
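
A minimal sketch of the corresponding build change (the exact 2.12.x patch version is illustrative):

scalaVersion := "2.12.15"  // Glue 3 runs Scala 2.12

// If you still need a 2.11 build for older Glue versions, cross-build instead:
crossScalaVersions := Seq("2.11.12", "2.12.15")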

See the AWS migration recommendations: https://docs.aws.amazon.com/glue/latest/dg/migrating-version-30.html

J.Hammond