
I want to parse a JSON string in our Spark project. I use IDEA. If I run my Scala code inside IDEA, everything works fine. But when I pack a jar with IDEA + SBT and submit this jar to the cluster, an error happens: Exception in thread "main" java.lang.NoClassDefFoundError: play/api/libs/json/JSObject

If I don't use the extra dependency jar, the submitted jar runs on the cluster without problems.

My pack operation procedure:

File -> Project Structure 
     -> Artifacts 
     -> "+" 
     -> JAR 
     -> From modules with dependencies 
     -> select "extract to the target JAR" and set "Main Class" 
     -> Build 
     -> Build Artifacts 
     -> Build
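
A common cause of this error is that IDEA's "extract to the target JAR" artifact does not reliably bundle library dependencies the way a Spark cluster expects. A more robust approach is to declare the dependency in `build.sbt` (as suggested in the comments) and build a fat jar with the sbt-assembly plugin. A minimal sketch — the Spark and plugin versions here are assumptions, not from the original post:

```scala
// build.sbt -- declare play-json as a managed dependency
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.6.7"

// Spark is already on the cluster, so mark it "provided" to keep it
// out of the fat jar (version is an assumption; match your cluster)
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
```

```scala
// project/plugins.sbt -- add the assembly plugin (version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")
```

Then run `sbt assembly` and submit the jar produced under `target/`; the assembled jar bundles play-json, so the cluster no longer needs it on its classpath.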

screenshots (not reproduced here): Artifacts, MANIFEST, spark-Error

Chuang
  • define your dependency in sbt rather than adding it. – koiralo Mar 09 '18 at 13:32
  • I added " libraryDependencies += "com.typesafe.play" %% "play-json" % "2.6.7" " but it still doesn't work. – Chuang Mar 09 '18 at 13:46
  • can you add this plugin: https://github.com/jrudolph/sbt-dependency-graph and show the dependency tree? – lev Mar 09 '18 at 13:59
  • I added it, but I can't use it because of "Unknown artifact. Not resolved or indexed". I tried to solve this with [link](https://stackoverflow.com/questions/41372978/unknown-artifact-not-resolved-or-indexed-error-for-scalatest) but it didn't help. – Chuang Mar 09 '18 at 14:36
  • Updating the project will download the plugin; the warning just means that the plugin might not have been downloaded yet. – Justin Kaeser Mar 09 '18 at 16:45

0 Answers