When I run the following simple Spark program with spark-submit:
import org.apache.spark.{SparkConf, SparkContext}
object TEST2 {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("TEST")
    val sc = new SparkContext(conf)
    val list = List(("aa", 1), ("bb", 2), ("cc", 3))
    val maps = list.toMap
  }
}
I get a java.lang.NoSuchMethodError on the line "val maps = list.toMap". But in spark-shell, or in plain Scala, the same code works without a problem:
scala> val list=List(("aa",1),("bb",2),("cc",3))
list: List[(String, Int)] = List((aa,1), (bb,2), (cc,3))
scala> val maps=list.toMap
maps: scala.collection.immutable.Map[String,Int] = Map(aa -> 1, bb -> 2, cc -> 3)
So what am I missing in spark-submit that breaks the "toMap" method? I compile the program with "sbt package" and it builds without any problem. Thanks!
P.S.: the build.sbt file is:
name := "TEST2"
version := "1.0"
scalaVersion := "2.11.6"
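For context, a NoSuchMethodError that appears only under spark-submit (but not in spark-shell) is the classic symptom of a Scala binary-version mismatch: the jar is compiled against one Scala version while the Spark distribution running it was built against another. Prebuilt Spark 1.x binaries were compiled with Scala 2.10, so a jar built with scalaVersion 2.11.6 would hit exactly this at runtime. A sketch of a build.sbt that avoids the mismatch, assuming a Spark 1.x distribution built against Scala 2.10 (the version numbers here are illustrative and should match your actual installation):

```scala
name := "TEST2"

version := "1.0"

// Must match the Scala version your Spark distribution was built with.
// Prebuilt Spark 1.x binaries use Scala 2.10; Spark 2.x uses 2.11.
scalaVersion := "2.10.6"

// "provided" because spark-submit supplies the Spark classes at runtime;
// %% appends the Scala binary version (_2.10) to the artifact name.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"
```

Running `spark-submit --version` prints the Scala version the distribution was built with, which is a quick way to check what scalaVersion the build should use.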