
I am writing a Spark shell script, and for some reason I have been told not to provide the code as a jar, but rather as a plain Spark Scala shell script.

My application focuses on reading Kafka streams into Spark. But since the Kafka streaming connector is not part of Spark itself, I need to supply the Kafka streaming dependency jar when submitting with spark-submit.
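
To make the setup concrete, the invocation I have in mind looks roughly like this (the file names are placeholders; since the code is a plain Scala script rather than a packaged application, I believe the runner would be `spark-shell -i` rather than `spark-submit`):

```
spark-shell -i my_kafka_script.scala --jars kafka-deps-assembly.jar
```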

I have written the Spark script in the IntelliJ IDEA IDE. I also have some other dependencies, which are visible under the "External Libraries" section of the IDE.

Now, how can I create a single dependency jar out of these external libraries so that I can supply it while submitting my Spark script?
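
For illustration, here is a minimal sketch of what I imagine such a dependency-only build could look like with sbt-assembly (all names and versions are assumptions on my part; Spark itself is marked `provided` because the cluster supplies it at runtime):

```scala
// build.sbt -- sketch of a dependency-only assembly (names/versions assumed).
// Assumes project/plugins.sbt contains:
//   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.7")
name := "kafka-deps"
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // Provided by the cluster at runtime, so kept out of the assembly.
  "org.apache.spark" %% "spark-core"      % "2.3.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.3.0" % "provided",
  // Not part of Spark itself, so bundled into the assembly.
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0"
)

// With no sources under src/, `sbt assembly` should produce a jar
// containing only the dependencies, not my Spark script.
```

Is that roughly the right approach, or is there a more direct way to export the "External Libraries" shown in IntelliJ into a single jar?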

user3243499
  • Possible duplicate of [Compiling Spark Scala Program into jar file using installed spark and maven](https://stackoverflow.com/questions/37930875/compiling-spark-scala-program-into-jar-file-using-installed-spark-and-maven) – philantrovert Jun 07 '18 at 08:30
  • Also: https://stackoverflow.com/questions/16222748/building-a-fat-jar-using-maven – philantrovert Jun 07 '18 at 08:31
  • @philantrovert My question is not a duplicate of the links you have suggested. I am NOT asking "how to create a package jar of my application?"; rather, I am asking "how to create a common dependency jar which does not include my Spark script?", so that I can submit my script to Spark using `spark-submit --jars <dependency-jar>`. If you find any duplicates of this question, please let me know. I would be thankful. – user3243499 Jun 07 '18 at 09:14

0 Answers