I'm trying to create a SparkContext in an IntelliJ 14 Scala Worksheet.
Here are my dependencies in build.sbt:
name := "LearnSpark"
version := "1.0"
scalaVersion := "2.11.7"
// for working with Spark API
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
Here is the code I run in the worksheet:
import org.apache.spark.{SparkContext, SparkConf}
val conf = new SparkConf().setMaster("local").setAppName("spark-play")
val sc = new SparkContext(conf)
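Once the context initializes, I'd expect even a trivial action to run, something like this (a minimal sanity check, along the lines of the standalone app further down):

// trivial action to confirm the context actually works
println(sc.parallelize(1 to 100).count()) // should print 100
sc.stop() // release the context between worksheet runs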
Instead, SparkContext fails to initialize with this error:
15/08/24 14:01:59 ERROR SparkContext: Error initializing SparkContext.
java.lang.ClassNotFoundException: rg.apache.spark.rpc.akka.AkkaRpcEnvFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
When I run Spark as a standalone app it works fine. For example:
import org.apache.spark.{SparkContext, SparkConf}
// stops verbose logs
import org.apache.log4j.{Level, Logger}

object TestMain {
  Logger.getLogger("org").setLevel(Level.OFF)

  def main(args: Array[String]): Unit = {
    // Create SparkContext
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("mySparkApp")
      .set("spark.executor.memory", "1g")
      .set("spark.rdd.compress", "true")
      .set("spark.storage.memoryFraction", "1")
    val sc = new SparkContext(conf)

    // collect the range to the driver, then keep values below 1000
    val data = sc.parallelize(1 to 10000000).collect().filter(_ < 1000)
    data.foreach(println)
  }
}
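As expected, this prints the numbers 1 through 999, so the same dependency setup works outside the worksheet.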
Can someone provide some guidance on where I should look to resolve this exception?
Thanks.