
I am running into version issues while setting up my environment. Please point me to documentation on how to check version compatibility.


plugins.sbt

addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")

build.sbt
name := "wcount"
version := "1.0"
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "1.6.0",
  "com.typesafe" % "config" % "1.3.0"
)
----------------------------------------
 scala -version
Scala code runner version 2.10.5 -- Copyright 2002-2013, LAMP/EPFL
----------------------------------------

Spark version 1.6.0

Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_95)
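One way to check compatibility (a sketch, not from the original post): any Scala program can print the version of the Scala standard library actually loaded at runtime via `scala.util.Properties`, and its binary version (e.g. 2.10) must match the `_2.xx` suffix of the `spark-core` artifact:

```scala
// Sketch: print the Scala standard library version on the classpath.
// The binary version (e.g. 2.10) must match the _2.xx suffix of spark-core.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(scala.util.Properties.versionString)       // e.g. "version 2.10.5"
    println(scala.util.Properties.versionNumberString) // e.g. "2.10.5"
  }
}
```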

sbt run input.txt output.txt

WordCount.scala

import org.apache.spark.SparkContext, org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]) {
    val conf = new SparkConf().
      setAppName("Word Count").
      setMaster("local")
    val sc = new SparkContext(conf)
    val inputPath = args(0)
    val outputPath = args(1)

    val wc = sc.textFile(inputPath).
      flatMap(rec => rec.split(" ")).
      map(rec => (rec, 1)).
      reduceByKey((acc, value) => acc + value)

    wc.saveAsTextFile(outputPath)

  }
}
---------------------------------------------

I am getting this error:

[info] Loading global plugins from /root/.sbt/0.13/plugins
[info] Set current project to wcount (in build file:/root/spark/wcount/)
[info] Running WordCount
[error] (run-main-0) java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
    at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1546)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:53)
    at WordCount$.main(WordCount.scala:5)
    at WordCount.main(WordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
    at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 1 s, completed Nov 28, 2016 10:04:20 PM

Ravinder Karra
  • Also tried libraryDependencies ++= Seq( "org.apache.spark" % "spark-core_2.10" % "1.6.0", "com.typesafe" % "config" % "1.3.0" ) – Ravinder Karra Nov 28 '16 at 22:26

1 Answer


You've set the Spark core library to the Scala 2.11 build:

"org.apache.spark" % "spark-core_2.11" % "1.6.0",

Change it to spark-core_2.10 so it matches your scalaVersion of 2.10.5, and try again.
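As a sketch of the corrected build.sbt (assuming you keep Scala 2.10.5): using the `%%` operator instead of hard-coding the `_2.11` suffix lets sbt append the project's Scala binary version automatically, which prevents this class of mismatch:

```scala
// build.sbt — sketch; %% resolves "spark-core" to spark-core_2.10
// because scalaVersion is 2.10.x
name := "wcount"
version := "1.0"
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0",
  "com.typesafe" % "config" % "1.3.0"
)
```

Note that `config` keeps a single `%` because it is a plain Java library with no Scala-version-specific builds.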

harschware
  • After above update , Did not change error message sbt run /user/data/wordcount_input.txt /user/data/wordcount_output.txt [info] Loading global plugins from /root/.sbt/0.13/plugins [info] Set current project to wcount (in build file:/root/spark/wcount/) [info] Running WordCount Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 16/11/28 23:46:07 INFO SparkContext: Running Spark version 1.6.0 [error] (run-main-0) java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class – Ravinder Karra Nov 28 '16 at 23:49
  • Get yourself to a state where you have a successful compilation and work yourself forward. In other words, simplest code possible. Start with just `sc.textFile(inputPath)` and remove everything after. – harschware Nov 28 '16 at 23:56
  • BTW: you are making progress. You were seeing `java.lang.NoSuchMethodError`, and now you are seeing `java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class`. When deciding which Scala version to use (2.11, 2.10, or 2.9), you have to know what your API is compatible with. What version of Spark are you running? See also: http://stackoverflow.com/questions/36050341/apache-spark-exception-in-thread-main-java-lang-noclassdeffounderror-scala-co – harschware Nov 28 '16 at 23:58