I have a Scala project which I have been building and running with sbt package and sbt run so far. Now I need to port it to another machine. I can see that the JAR file is created under
$PROJECT_HOME/target/scala-2.9.3/My-app_2.9.3-1.0.jar
But when I try to run it with
java -jar target/scala-2.9.3/My-app_2.9.3-1.0.jar
I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkContext
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2451)
at java.lang.Class.getMethod0(Class.java:2694)
at java.lang.Class.getMethod(Class.java:1622)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkContext
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
So it fails with java.lang.NoClassDefFoundError: org/apache/spark/SparkContext. I know that NoClassDefFoundError generally occurs when the class loader cannot find or load the definition of a class that was present at compile time. But in my build.sbt I have declared the dependency:
name := "My App"
version := "1.0"
scalaVersion := "2.9.3"
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.8.0-incubating"
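One way to check whether the Spark classes were actually bundled into the JAR (assuming the JDK's jar tool is on the PATH) is to list its contents and look for the org/apache/spark entries:

```shell
# List the entries packaged into the JAR and filter for Spark classes.
# If nothing is printed before the fallback message, the dependency
# classes were not bundled into the JAR.
jar tf target/scala-2.9.3/My-app_2.9.3-1.0.jar 2>/dev/null \
  | grep 'org/apache/spark' \
  || echo "no Spark classes found in JAR"
```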
Any pointers on the reason for this error would be appreciated. Thanks!