I have a Scala project which I have been building and running with sbt package and sbt run so far. Now I need to port it to another machine. I can see that the JAR file is created under

$PROJECT_HOME/target/scala-2.9.3/My-app_2.9.3-1.0.jar

But when I try to run it with

java -jar target/scala-2.9.3/My-app_2.9.3-1.0.jar

I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkContext
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2451)
    at java.lang.Class.getMethod0(Class.java:2694)
    at java.lang.Class.getMethod(Class.java:1622)
    at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkContext
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)

I know that NoClassDefFoundError generally occurs when the JVM can't find/load the definition of a class at runtime. But in my build.sbt I have declared the dependency:

name := "My App"

version := "1.0"

scalaVersion := "2.9.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.8.0-incubating"

Any pointers on the reason for this error would be appreciated. Thanks!

1 Answer

You have to either build a fat JAR which includes all of your dependencies (in your case, Spark) or manually add the Spark artifact to the classpath. For the first option you will likely want to use the onejar or sbt-assembly plugin for sbt.
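For example, a minimal sbt-assembly setup could look like the following (the plugin version and file layout are only illustrative; check the sbt-assembly docs for a release that matches your sbt version). In project/plugins.sbt:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.2")

and at the top of build.sbt (for the 0.9.x plugin API):

import AssemblyKeys._

assemblySettings

Running sbt assembly should then produce a single fat JAR under target/scala-2.9.3/ (named something like My-App-assembly-1.0.jar) that you can copy to the other machine and start with a plain java -jar.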

  • How can I add Spark to the classpath? – Learner Oct 25 '13 at 09:29
  • @Learner you have to go to `~/.ivy2/cache/` (it's the place where sbt stores all resolved artifacts) and find the appropriate jar in a subdirectory (the structure should match the artifact description, e.g. *~/.ivy2/cache/org/apache/spark/spark-core/*); the jar will be named after the version, e.g. 0.8.0-incubating.jar. Then you have to pass it via the [classpath argument](http://stackoverflow.com/questions/219585/setting-multiple-jars-in-java-classpath) to the java call (see the sketch after these comments). – om-nom-nom Oct 25 '13 at 10:10
  • You might want to set `retrieveManaged := true`, which will cause sbt to copy all the needed jars into the `lib_managed/jars` directory, so you don't have to go groping around in `~/.ivy2/cache/` for them. – Seth Tisue Oct 25 '13 at 11:38
  • You can also use sbt-native-packager and the java-application archetype to generate a script for you (with the `stage` task) that will allow you to run with thin jars. – jsuereth Oct 26 '13 at 13:37
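To illustrate the classpath route from the comments above, a rough sketch (the main class name com.example.MyApp is a placeholder, the exact jar paths depend on your dependencies, and Spark's transitive dependencies must be on the classpath too): first add

retrieveManaged := true

to build.sbt so that sbt update copies the resolved jars into lib_managed/jars, and then launch with something like

java -cp "target/scala-2.9.3/My-app_2.9.3-1.0.jar:$(find lib_managed -name '*.jar' | tr '\n' ':')" com.example.MyApp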