
My Spark application depends on adam_2.11-0.20.0.jar, so every time I have to package my application together with adam_2.11-0.20.0.jar as a fat jar in order to submit it to Spark.

For example, my fat jar is myApp1-adam_2.11-0.20.0.jar.

It works fine when submitted as follows:

spark-submit --class com.ano.adam.AnnoSp myApp1-adam_2.11-0.20.0.jar

However, when using --jars it reports an exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/bdgenomics/adam/rdd

spark-submit --class com.ano.adam.AnnoSp myApp1.jar --jars adam_2.11-0.20.0.jar

My question is: how can I submit using 2 separate jars, without packaging them together?

spark-submit --class com.ano.adam.AnnoSp myApp1.jar adam_2.11-0.20.0.jar

1 Answer


Put all the jars in one folder and then do it like below.

Option 1:

I think the better way of doing this is:

$SPARK_HOME/bin/spark-submit \
--class com.ano.adam.AnnoSp \
--driver-class-path $(echo /usr/local/share/build/libs/*.jar | tr ' ' ':') \
--jars $(echo /usr/local/share/build/libs/*.jar | tr ' ' ',') \
myApp1.jar

With this approach you won't accidentally miss any jar from the classpath, so no warning should come up. Note that --driver-class-path takes a ':'-separated classpath, while --jars takes a comma-separated list.

Option 2: see my answer here:

spark-submit-jars-arguments-wants-comma-list-how-to-declare-a-directory

Option 3: If you want to do a programmatic submit, adding jars through the API is also possible. I'm not going into the details of it here, but a rough sketch follows below.
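As an illustration only, here is a minimal sketch of such a programmatic submit using Spark's SparkLauncher. The jar names and main class are taken from the question; the master URL is just an assumption, so adapt everything to your setup:

import org.apache.spark.launcher.SparkLauncher

object SubmitAnnoSp {
  def main(args: Array[String]): Unit = {
    // Build and launch the equivalent of a spark-submit call from code,
    // shipping the dependency jar the same way --jars would.
    val process = new SparkLauncher()
      .setAppResource("myApp1.jar")          // thin application jar
      .setMainClass("com.ano.adam.AnnoSp")
      .addJar("adam_2.11-0.20.0.jar")        // dependency jar, no fat jar needed
      .setMaster("local[*]")                 // assumption: replace with your master URL
      .launch()
    process.waitFor()                        // wait for the submitted app to finish
  }
}

launch() returns a plain java.lang.Process, so the submitting JVM can monitor the Spark application or simply wait for it to finish as above.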
