
I am running a word count program in Spark, but I am getting the error below. I have added scala-xml_2.11-1.0.2.jar.

    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    16/12/16 05:14:02 INFO SparkContext: Running Spark version 2.0.2
    16/12/16 05:14:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    16/12/16 05:14:03 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 192.168.59.132 instead (on interface ens33) 
    16/12/16 05:14:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
    16/12/16 05:14:04 INFO SecurityManager: Changing view acls to: hadoopusr
    16/12/16 05:14:04 INFO SecurityManager: Changing modify acls to: hadoopusr
    16/12/16 05:14:04 INFO SecurityManager: Changing view acls groups to: 
    16/12/16 05:14:04 INFO SecurityManager: Changing modify acls groups to: 
    16/12/16 05:14:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hadoopusr); groups with view permissions: Set(); users  with modify permissions: Set(hadoopusr); groups with modify permissions: Set()
    16/12/16 05:14:05 INFO Utils: Successfully started service 'sparkDriver' on port 40559.
    16/12/16 05:14:05 INFO SparkEnv: Registering MapOutputTracker
    16/12/16 05:14:05 INFO SparkEnv: Registering BlockManagerMaster
    16/12/16 05:14:05 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-0b830180-ae51-451f-9673-4f98dbaff520
    16/12/16 05:14:05 INFO MemoryStore: MemoryStore started with capacity 433.6 MB
    16/12/16 05:14:05 INFO SparkEnv: Registering OutputCommitCoordinator
    Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
        at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44)
        at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
        at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:219)
        at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:161)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:440)
        at LearnScala.WordCount$.main(WordCount.scala:15)
        at LearnScala.WordCount.main(WordCount.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
    16/12/16 05:14:05 INFO DiskBlockManager: Shutdown hook called
    16/12/16 05:14:05 INFO ShutdownHookManager: Shutdown hook called
    16/12/16 05:14:05 INFO ShutdownHookManager: Deleting directory /tmp/spark-789e9a76-894f-468b-a39a-cf00da30e4ba/userFiles-3656d5f8-25ba-45c4-b2f6-9f654a049bb1
    16/12/16 05:14:05 INFO ShutdownHookManager: Deleting directory /tmp/spark-789e9a76-894f-468b-a39a-cf00da30e4ba

I am using the versions below.

build.sbt:

    name := "SparkApps"

    version := "1.0"

    scalaVersion := "2.11.5"

    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.2"
    // https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10
    libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "2.0.2"
    // https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.10
    libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "2.0.2"
    // https://mvnrepository.com/artifact/org.apache.spark/spark-yarn_2.11
    libraryDependencies += "org.apache.spark" % "spark-yarn_2.10" % "2.0.2"

Spark version: 2.0.2
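
The program itself is a plain word count; a minimal sketch along these lines (the object name is taken from the stack trace, the input path is hypothetical) shows the failure happens while constructing the SparkContext, before any word-count logic runs:

    package LearnScala

    import org.apache.spark.{SparkConf, SparkContext}

    object WordCount {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
        val sc = new SparkContext(conf) // the NoSuchMethodError is thrown here
        val counts = sc.textFile("input.txt")   // hypothetical input path
          .flatMap(_.split("\\s+"))
          .map(word => (word, 1))
          .reduceByKey(_ + _)
        counts.collect().foreach(println)
        sc.stop()
      }
    }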


1 Answer


I am running a word count program in Spark, but I am getting the error below. I have added scala-xml_2.11-1.0.2.jar.

Later we can see:

    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.2"

Choose one ;) Scala 2.10 or Scala 2.11: either change the Scala-XML jar to its 2.10 build or change the Spark dependencies to _2.11. Since Spark 2.0, Scala 2.11 is the recommended version.

You can let sbt pick the proper Scala version for each dependency by using %% in build.sbt:

    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"
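
Applied to the whole file, a corrected build.sbt might look like the sketch below (assuming you stay on Scala 2.11; with %%, sbt appends _2.11 to every artifact name for you):

    name := "SparkApps"

    version := "1.0"

    scalaVersion := "2.11.5"

    // %% appends the Scala binary version (_2.11) to each artifact name,
    // so every Spark module resolves against the same Scala version
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.2"
    libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.0.2"
    libraryDependencies += "org.apache.spark" %% "spark-yarn" % "2.0.2"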

Secondly, there is no Scala-XML dependency declared in build.sbt - you should add it.
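
For example, keeping the 1.0.2 version from the jar mentioned in the question (with %%, this resolves to scala-xml_2.11-1.0.2):

    // Scala-XML module, resolved to scala-xml_2.11 by %%
    libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.2"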

Finally, you must add all third-party jars to spark-submit via the --jars option, or build an uber jar - see this question.
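
For the uber-jar route, one common option is the sbt-assembly plugin. A minimal sketch (the plugin version is an assumption - pick one matching your sbt release):

    // project/plugins.sbt - registers the sbt-assembly plugin
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

Running sbt assembly then produces one jar containing your classes and third-party dependencies, which you can pass directly to spark-submit. If you keep a thin jar instead, ship the extra jars explicitly, e.g. spark-submit --jars /path/to/scala-xml_2.11-1.0.2.jar ... (the path here is hypothetical).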
