
I have a Spark/Scala application. I tried to display a simple message, "Hello my App". When I compile it with `sbt compile` and run it with `sbt run`, it works: my message is displayed successfully, but then it shows an error like this:

    Hello my application!
    16/11/27 15:17:11 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
    java.lang.InterruptedException
    ERROR ContextCleaner: Error in cleaning thread
    java.lang.InterruptedException
        at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:67)
    16/11/27 15:17:11 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
    [success] Total time: 13 s, completed Nov 27, 2016 3:17:12 PM
    16/11/27 15:17:12 INFO DiskBlockManager: Shutdown hook called

I can't tell whether this is fine or not! Also, when I try to submit my jar file after the build, it displays an error.

My command line looks like:

    spark-submit "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

And the error is:

    Error: Cannot load main class from JAR file:/root/projectFilms/appfilms
    Run with --help for usage help or --verbose for debug output
    16/11/27 15:24:11 INFO Utils: Shutdown hook called

Can anyone help me with this?

  • Have you tried doing what the error message said? Use `--help` or `--verbose` on the spark-submit to see what the logs have to say. – Derek_M Nov 27 '16 at 16:22
  • AFAIK it should be submitted with `--jars` option like ... `$SPARK_HOME/bin/spark-submit --driver-class-path your jar[s] --jars your jar[s] ` – Ram Ghadiyaram Nov 27 '16 at 16:29
  • Can you try this: `spark-submit --verbose --master local[4] --class yourclass yourjar.jar` – Ram Ghadiyaram Nov 27 '16 at 16:38
  • If neither works, run `jar -tvf system-of-recommandation_2.11-1.0.jar | grep appfilms` to check that the expected class is actually in the jar file. – Ram Ghadiyaram Nov 27 '16 at 16:40
  • Also check my [answer](http://stackoverflow.com/questions/40796818/how-to-append-a-resource-jar-for-spark-submit?noredirect=1&lq=1). – Ram Ghadiyaram Nov 27 '16 at 16:43

2 Answers


The error is due to the fact that the SparkContext is not stopped before the application exits. You should stop it explicitly at the end of your application by calling `sc.stop()` (or, in Spark 2.x, `spark.stop()` on the `SparkSession`), so that Spark's background threads can shut down cleanly. Inspiration for solving this error came from my own experience and the following sources: Spark Context, Spark Listener Bus error
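
As a minimal sketch (the object name `AppFilms` is a placeholder, not taken from the question; adapt it to your own main object):

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder object name; use your application's actual main object.
    object AppFilms {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("appfilms").setMaster("local[4]")
        val sc = new SparkContext(conf)

        println("Hello my application!")

        // Stop the context before the JVM exits, so Spark's background
        // threads (SparkListenerBus, ContextCleaner) shut down cleanly
        // instead of being interrupted.
        sc.stop()
      }
    }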

– Paul Velthuis

You forgot the `--class` parameter. You ran:

    spark-submit "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

It should be:

    spark-submit --class "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

Please note that if `appfilms` belongs to a package, don't forget to add the package name, as in `packagename.appfilms` (see the sketch below).
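
For example (a sketch only; the package name `recommendation` is an assumption, not taken from the question):

    // src/main/scala/recommendation/appfilms.scala
    package recommendation            // hypothetical package name

    object appfilms {
      def main(args: Array[String]): Unit = {
        // ... your Spark code ...
      }
    }

would be submitted as:

    spark-submit --class recommendation.appfilms --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar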

I believe this will suffice.