
My code runs normally in IDEA in local mode, but when I packaged it into a jar and uploaded it to the Spark server I deployed, it failed with NoSuchMethodError: scala.Predef$.refArrayOps. The line of code that fails is:

val expectArray = expectVertex.take(2).toArray.sortBy(it => it._1)

where expectVertex is a Scala Map whose key type is graphx.VertexId and whose value type is Int.

I also ran into this issue with simple Spark code; the error occurred when I used an Array method. The code looks like this:

package org.example

import org.apache.spark.graphx.{Edge, Graph}
import org.apache.spark.{SparkConf, SparkContext}

import java.util.logging.{Level, Logger}

/**
 * Hello world!
 *
 */
class App{
  def run(): Unit ={
    Logger.getLogger("org.apache.spark").setLevel(Level.WARNING)
    Logger.getLogger("org.eclipse.jetty.server").setLevel(Level.OFF)
    val conf = new SparkConf().setAppName("AXU test")
      .setMaster("local")
    val sc = new SparkContext(conf)
    val vertices = sc.parallelize(Array((1L, "A"), (2L, "B"), (3L, "C"), (4L, "D")))
    val edges = sc.parallelize(Array(Edge(1L, 2L, "friend"), Edge(2L, 3L, "follow"), Edge(3L, 4L, "friend")))
    val graph = Graph(vertices, edges)
    val inDegrees = graph.inDegrees
    inDegrees.collect().foreach(println)
    val deg = inDegrees.collect()
    for( i <- 0 to deg.length-1){
      print("this is no." + (i+1) + " point indegree:")
      println("id: " + deg(i)._1 + " value: " + deg(i)._2)
    }
    sc.stop()
  }
}

The log is:

Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:65)
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
    at org.example.App.run(App.scala:23)
    at org.example.Main$.main(Main.scala:6)
    at org.example.Main.main(Main.scala)

If I remove the code at line 23 so that only inDegrees.collect().foreach(println) remains, it works normally. I compile and run with Scala 2.12.7 in both cases. It looks like I can't use methods such as Array[T].foreach or Array[T].sortBy(it => it._1) in the jar (I packaged the jar with Maven). The relevant Maven configuration is below.

    <properties>
        <scala.version>2.12.7</scala.version>
        <spark.version>2.4.4</spark.version>
    </properties>


    <build>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.2</version>
                <executions>
                    <execution>
                        <id>compile-scala</id>
                        <phase>compile</phase>
                        <goals>
                            <goal>add-source</goal>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>test-compile-scala</id>
                        <phase>test-compile</phase>
                        <goals>
                            <goal>add-source</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <scalaVersion>${scala.version}</scalaVersion>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <mainClass>org.example.Main</mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>assembly</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.6.0</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>exec</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <executable>java</executable>
                    <includeProjectDependencies>true</includeProjectDependencies>
                    <includePluginDependencies>false</includePluginDependencies>
                    <classpathScope>compile</classpathScope>
                    <mainClass>org.example.Main</mainClass>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>

Can someone tell me why this problem appears? Thank you in advance.

axu

1 Answer


Most probably you're compiling your code locally with Scala 2.12, but on the server it's running with Scala 2.13 or 2.11.

Try recompiling your code with the version of Scala used on the server.

Scala 2.11, 2.12, 2.13 are binary incompatible.
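For example, if the server runs a Spark distribution built with Scala 2.11 (the prebuilt Spark 2.4.4 binaries are commonly built with 2.11), the Maven properties would have to be aligned with that. A sketch of what the relevant pom fragments might look like — the version numbers here are illustrative, check your cluster:

```xml
<properties>
    <!-- Must match the Scala version your Spark server was built with -->
    <scala.version>2.11.12</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>2.4.4</spark.version>
</properties>

<dependencies>
    <!-- Spark artifacts carry the Scala binary version in the artifactId,
         so this must also be switched from _2.12 to _2.11 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-graphx_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <!-- the cluster supplies Spark at runtime -->
        <scope>provided</scope>
    </dependency>
</dependencies>
```

The key point is that scala.version, the _2.xx suffix of every Spark/Scala dependency, and the server's Scala version must all agree.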

The signature of refArrayOps differs (in a binary-incompatible way):

  • in Scala 2.13

def refArrayOps(scala.Array[scala.Any]): scala.Any (scalap)

public <T> T[] refArrayOps(T[]) (javap)

(On why scalap and javap can disagree, see: scalap and javap showing different method signature.)

@inline implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ArrayOps[T] [api] [sources]

  • in Scala 2.12

def refArrayOps(scala.Array[scala.Any]): scala.Array[scala.Any] (scalap)

public <T> T[] refArrayOps(T[]) (javap)

implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ArrayOps.ofRef[T] [api] [sources]

  • in Scala 2.11 and earlier

def refArrayOps(scala.Array[scala.Any]): scala.collection.mutable.ArrayOps (scalap)

public <T> scala.collection.mutable.ArrayOps<T> refArrayOps(T[]) (javap)

implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ArrayOps[T] [api 2.11 2.10] [sources 2.11 2.10 2.9]

See also:

  • Kafka start error on MAC .. something related to java and scala ... NoSuchMethodError: scala.Predef$.refArrayOps
  • java.lang.NoSuchMethodError: scala.Predef$.refArrayOps
  • How do I fix a NoSuchMethodError?
  • java.lang.NoSuchMethodError: org.apache.hadoop.hive.common.FileUtils.mkdir while trying to save a table to Hive
  • java.lang.NoSuchMethodError: scala.Predef$.refArrayOps in Spark job with Scala


You can run

import java.net.URLClassLoader
import java.util.Arrays

//  Scala 2.13+ alternative (List.unfold and Option.when don't exist in 2.12):
//  List.unfold(getClass.getClassLoader) { cl =>
//    val urls = s"classloader: ${cl.getClass.getName}" :: 
//      (cl match {
//        case cl: URLClassLoader =>
//          "classloader urls:" :: 
//            cl.getURLs.map(_.toString).toList
//        case _ => List("not URLClassLoader")
//      })
//    Option.when(cl != null)((urls, cl.getParent))
//  }.flatten.foreach(println)

var cl = getClass.getClassLoader
while (cl != null) {
  println(s"classloader: ${cl.getClass.getName}")
  cl match {
    case cl: URLClassLoader =>
      println("classloader urls:")
      // cl.getURLs.foreach(println) // uses Scala refArrayOps again
      println(Arrays.toString(cl.getURLs.asInstanceOf[Array[Object]])) // pure Java
    case _ =>
      println("not URLClassLoader")
  }
  cl = cl.getParent
}

or

println(
  System.getProperty("java.class.path")
)

(What's the difference between System.getProperty("java.class.path") and getClassLoader.getURLs()?)

inside your actual Spark environment. Then you'll see your classpath: whether there are multiple scala-library jars, or a mix of _2.11, _2.12, and _2.13 dependencies.
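You can also inspect the Spark distribution on the server directly. A sketch, assuming SPARK_HOME points at the server's Spark install (the /opt/spark fallback is just an illustrative guess):

```shell
# Which Scala standard library does this Spark distribution ship?
# The jar name encodes the version, e.g. scala-library-2.11.12.jar
ls "${SPARK_HOME:-/opt/spark}/jars" 2>/dev/null | grep scala-library \
  || echo "no scala-library jar found (check SPARK_HOME)"
```

spark-submit --version also prints the Scala version the distribution was built with.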

In sbt you can also log the full classpath during compilation (https://www.scala-sbt.org/1.x/docs/Howto-Classpaths.html):

scalacOptions += "-Ylog-classpath"


Note that scala -version shows the Scala installed on the system; the Scala on the classpath can be a different one.
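To see which Scala is actually on the runtime classpath, you can print it from inside the job itself, e.g. in the driver before creating the SparkContext. A minimal sketch:

```scala
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // The Scala standard library reports its own version at runtime,
    // so this reflects the scala-library jar that was actually loaded,
    // not the system-wide installation.
    println(scala.util.Properties.versionString)        // e.g. "version 2.12.7"
    println(scala.util.Properties.versionNumberString)  // e.g. "2.12.7"
  }
}
```

If this prints a different version than the one you compiled against, you've found the mismatch.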

Why does SBT Show a Different ScalaVersion than My System?

build.sbt does not Work with Different Scala Versions

Dmytro Mitin