
I use mvn clean package to build a jar, and no error is raised. But when I then use spark-submit to run the jar, the following error occurs:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;

I am pretty sure the above error is raised by clause 3 below; the other clauses are correct:

1-val q1 = Source.fromFile("blackList.sql")
2-val q = q1.mkString.replace("theDate1", theDate1).split(";")
3-for(s <- q){
  println(s);
}
4-q1.close()

The above code segment traverses the Array[String] object q and prints each element.

The version of Spark I am using is 2.4.5, and the Scala version in my pom.xml file is 2.12.10. I have tried several different Scala versions with this Spark version, but the error described in the title is still raised.
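This kind of NoSuchMethodError on scala.Predef usually points at a Scala binary-version mismatch between the compiled jar and the Spark runtime. As a quick diagnostic sketch (the object name here is mine, not from the original code), you can print the Scala version the runtime actually uses, e.g. in spark-shell or from the app's main method, and compare it with the 2.12.10 the jar was compiled against:

```scala
// Prints the Scala version of the running JVM's Scala library.
// If this reports 2.11.x while the jar was compiled for 2.12.x,
// that binary mismatch explains the NoSuchMethodError.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(scala.util.Properties.versionString)
  }
}
```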

Here is the related part of my pom.xml file:

<properties>
    <project.version>2.4.5</project.version>
    <java.version>1.8</java.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <scala.version>2.12.10</scala.version>
    <scala.main.verison>2.12</scala.main.verison>
    <spark.version>2.4.5</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.main.verison}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>

    <dependency>
        <groupId>com.typesafe</groupId>
        <artifactId>config</artifactId>
        <version>1.4.1</version>
    </dependency>

    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.12.10</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_${scala.main.verison}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
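For reference, the pre-built Spark 2.4.5 distributions are compiled against Scala 2.11 by default, so if the cluster runs such a build, the pom would need to target 2.11 rather than 2.12. A sketch of the changed properties, keeping this pom's own property names (including the misspelled `scala.main.verison`, which the artifact IDs reference):

```xml
<properties>
    <scala.version>2.11.12</scala.version>
    <scala.main.verison>2.11</scala.main.verison>
    <spark.version>2.4.5</spark.version>
</properties>
```

The scala-library dependency version would then also change to 2.11.12 to match.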

So, what exactly is the problem, and how do I fix it? Thanks for your help.

Yafei Wei
  • Thanks very much, this link answers my question. I switched Scala in IntelliJ from version 2.12.10 to version 2.11.12 (Spark uses Scala 2.11.12), and then everything works! – Yafei Wei Aug 18 '21 at 08:24
  • https://stackoverflow.com/questions/75947449/run-a-scala-code-jar-appear-nosuchmethoderrorscala-predef-refarrayops – Dmytro Mitin Apr 07 '23 at 04:46
