
My Scala 2.12.6 code is as follows:


res.forEach(elem => {
  val matcher = pattern.matcher(elem.getValue.render().replace("\"", ""))
  query += "," + (
    if (matcher.matches())
      "'" + matcher.group().replace("$", "") + "'" + " as " + elem.getKey.replace("\"", "`")
    else
      elem.getValue.render().replace("\"", "") + " as " + elem.getKey.replace("\"", "`")
  )
})
// query = "select " + query.substring(1) + " from final_table where dt='" + today + "'"
create_query = "select " + query.substring(1) + s" from $final_table swv where swv.dt='$today'" + s" and swv.cveid IN ($create_query"
create_query = "select " + query.substring(1) + " from default.secureworks_vulnerabilities where dt='2020-01-29'"

println("Create Query: " + create_query)
val df = spark.sql(create_query)
println("Number of rows selected: " + df.count())

if (df.count() > 0) {
  df.show()
  val createJSON = CreateJSON
  val masterJSON = new JSONObject()
  val masterJSONArray = new JSONArray()
  try {
    df.collect().foreach(row => {
      val fields = createJSON.generate(row)
      masterJSONArray.put(fields)
    })
  } catch {
    case exp: SparkException =>
      println("Exception raised: " + exp.getMessage)
      System.exit(1)
  }
}

This was running fine a few days ago on Spark 2.4 with Scala 2.11.12, but now I see this runtime error. How can I avoid it? I am building the jar with the scala-sdk-2.12 library in IntelliJ.

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
    at com.bofa.gis.App$.main(App.scala:81)
    at com.bofa.gis.App.main(App.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:673)
20/02/04 23:20:09 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
    at com.bofa.gis.App$.main(App.scala:81)
    at com.bofa.gis.App.main(App.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:673)
)

1 Answer


That error indicates you need to update your Spark dependencies, in your build.sbt or related files (wherever you specify dependencies), to match the version of Scala you're compiling with: the artifact ID changes from spark-core_2.11 to spark-core_2.12.
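For example, a minimal build.sbt sketch with the Scala suffix spelled out explicitly might look like the following; the Scala and Spark version numbers are placeholders, not taken from the question, so use whatever your cluster actually runs:

scalaVersion := "2.12.6"

libraryDependencies ++= Seq(
  // the _2.12 suffix must match the scalaVersion above
  "org.apache.spark" % "spark-core_2.12" % "2.4.4",
  "org.apache.spark" % "spark-sql_2.12"  % "2.4.4"
)

If any Spark artifact keeps a _2.11 suffix while the application is compiled with 2.12, the mismatched scala-library on the classpath can produce exactly this kind of NoSuchMethodError on scala.Predef.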

Spark is compiled separately for Scala 2.11.x and, in more recent releases, for Scala 2.12.x. See https://mvnrepository.com/artifact/org.apache.spark/spark-core and note the third column, which lists a different artifact per Scala version.

  • If the dependency scope is "provided", your jar does not contain the dependency. Instead, you must ensure that the jars under $SPARK_HOME on your cluster or machine all contain the version of Spark compiled against the matching version of Scala; the spark-submit command puts these on the classpath.
  • You may also want to see the dependency docs, which explain how to automatically pick up the matching Scala version by using %% in library dependencies (see the sketch after this list).
  • Another approach is to inspect the resolved dependency tree with a plugin such as Coursier, e.g. sbt 'coursierDependencyTree'. For example, if this tool displays something like org.scala-lang:scala-library:2.11.12 -> 2.12.8, then the version chosen for that sub-dependency is incompatible with the dependency that pulled it in.
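As a sketch of the %% form under the same assumptions (sbt, placeholder version numbers), combined with the "provided" scope mentioned above:

scalaVersion := "2.12.6"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix automatically, so this resolves to spark-core_2.12
  "org.apache.spark" %% "spark-core" % "2.4.4" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.4.4" % "provided"
)

With %% the artifact can never silently stay on _2.11 after scalaVersion moves to 2.12, which is what typically produces a scala.Predef NoSuchMethodError like the one in the question.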