
I have tried the following ways to add jars in my code:

1)

SparkConf conf = new SparkConf().setAppName("myApp")
                 .setMaster("local[*]")
                 .setJars(new String[]{"jar1.jar", "jar2.jar"});

2)

SparkConf conf = new SparkConf().setAppName("myApp")
                 .setMaster("local[*]")
                 .set("spark.jars", "path/to/jars");

3)

SparkConf conf = new SparkConf().setAppName("myApp")
                 .setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(conf);
sc.addJar("path/to/jars");

But I am still getting java.lang.ClassNotFoundException for classes from the third-party libraries.

However, the same code runs fine when I submit it with the --jars option, like:

./spark-submit --class com.test.spark.App --jars jar1.jar,jar2.jar main.jar

Spark version: spark-2.0.0-bin-hadoop2.7
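
For reference, here is a self-contained sketch of the full driver setup for my first attempt, which I would expect to mirror the spark-submit command above (the absolute jar paths below are placeholders, not my actual paths):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class App {
    public static void main(String[] args) {
        // Sketch only: the same jars that spark-submit receives via --jars,
        // listed here with placeholder absolute paths
        SparkConf conf = new SparkConf()
                .setAppName("myApp")
                .setMaster("local[*]")
                .setJars(new String[]{
                        "/full/path/to/jar1.jar",
                        "/full/path/to/jar2.jar"});

        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... job code that uses classes from the third-party jars ...
        sc.stop();
    }
}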
