
I am using `sbt package` to create a .jar file and submit it with `spark-submit`.

if I use

package mygraph
import mygraph._
object GApp {
    def main ...

Then it throws `ClassNotFoundException: GApp`, but if I delete the first line and change it to

import mygraph._
object GApp {
    def main ...

then it works. Why?
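
My guess is that the `package` declaration changes the object's fully qualified name from `GApp` to `mygraph.GApp`, so a lookup by the bare name no longer resolves. A minimal sketch of what I mean (assuming the jar is submitted with `--class GApp`):

package mygraph

// With the package declaration, the fully qualified name of the
// object is mygraph.GApp, so a bare-name lookup fails:
//   spark-submit --class GApp g.jar           => ClassNotFoundException
//   spark-submit --class mygraph.GApp g.jar   => should resolve
object GApp {
  def main(args: Array[String]): Unit = {
    // application body elided, as in the snippets above
  }
}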

My `build.sbt` is as follows:

name := "ag"

version := "1.0"

artifactName := { (sv, md, art) => "g.jar" }

scalaVersion := "2.11.8"

assemblyJarName in assembly := "G.jar"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "2.0.1" % "provided",
  "org.apache.spark" %% "spark-graphx" % "2.0.1" % "provided",
  "neo4j-contrib" % "neo4j-spark-connector" % "2.0.0-M2" % "provided"
)
  • can you post your sbt definition as well? – prayagupa May 17 '18 at 03:40
  • @prayagupd do you mean `build.sbt`? I added my `build.sbt` just now – Litchy May 17 '18 at 03:44
  • 1
    Ok, you might want to add `mainClass in (Compile, run) := Some("mygraph.GApp")` so that jar knows which main to run. I believe you fire `java -jar G.jar` to run the jar. Or else you can do `java -cp G.jar mygraph.GApp` - See [How to set main class in build?](https://stackoverflow.com/a/18198743/432903) – prayagupa May 17 '18 at 03:50
  • Which sbt version are you using? And please try upgrading the Scala version – Raman Mishra May 17 '18 at 03:58
  • @RamanMishra So what you mean is that `package` should not affect the use of the program? – Litchy May 17 '18 at 04:04
  • @prayagupd thanks! I think this tutorial would help me – Litchy May 17 '18 at 04:04
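
For reference, a minimal sketch of the `mainClass` setting prayagupa suggests in the comments (sbt 0.13-style syntax; the `packageBin`-scoped variant is an assumption about what writes `Main-Class` into the jar manifest, worth verifying against your sbt version):

// build.sbt

// Used when running inside the build with `sbt run`:
mainClass in (Compile, run) := Some("mygraph.GApp")

// Written into the manifest of the jar produced by `sbt package`,
// so `java -jar g.jar` knows which main to start (assumption):
mainClass in (Compile, packageBin) := Some("mygraph.GApp")

Naming the class explicitly, e.g. `java -cp g.jar mygraph.GApp` or `spark-submit --class mygraph.GApp g.jar`, works without any manifest entry, since the fully qualified name is given on the command line.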

0 Answers