
I have a Scala app which uses an external JAR library. How can I use it if the application jar was copied to HDFS?

Locally I started it with `--conf spark.driver.extraClassPath=./lib/*`, but if I use an HDFS link instead it does not work.
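One approach (a sketch only; the HDFS paths, class name, and jar names below are placeholders) is to pass the dependency jars with `--jars`, which accepts `hdfs://` URLs and distributes the jars to the executors, rather than `extraClassPath`, which expects paths on each node's local filesystem:

```shell
# extraClassPath resolves against each node's local filesystem,
# so an hdfs:// URL there won't work. --jars, by contrast,
# accepts hdfs:// URLs and ships the jars to driver and executors.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.Main \
  --jars hdfs:///user/me/lib/dep1.jar,hdfs:///user/me/lib/dep2.jar \
  hdfs:///user/me/app/myapp.jar
```

In `cluster` deploy mode the application jar itself can also be an HDFS path, as shown on the last line.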

  • Why don't you just build a fat jar using Maven https://stackoverflow.com/questions/16222748/building-a-fat-jar-using-maven ? The same thing is available for Gradle and other build tools. – jojo_Berlin Aug 03 '17 at 16:29
  • @jojo_Berlin That would work, but I don't need all the libs. – user5730669 Aug 04 '17 at 02:19
  • @user5730669: check my [answer here](https://stackoverflow.com/a/40797172/647053) – Ram Ghadiyaram Aug 22 '17 at 19:34
  • Possible duplicate of [how to append a resource jar for spark-submit?](https://stackoverflow.com/questions/40796818/how-to-append-a-resource-jar-for-spark-submit) – Ram Ghadiyaram Aug 22 '17 at 19:34
  • @jojo_Berlin I did create a fat jar, but it is not a good solution because the resulting jar file is big and gets copied to all nodes when the job runs. After developing the application I will create an Oozie workflow and have Oozie upload all the external libs to HDFS. – user5730669 Aug 23 '17 at 11:11
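For reference, the fat-jar route discussed in the comments can be sketched as follows (assuming an sbt project with the sbt-assembly plugin enabled; the class name, version, and jar name are placeholders, and Maven users would use the shade plugin instead):

```shell
# Build a single jar containing the app plus all of its
# dependencies (requires sbt-assembly in project/plugins.sbt).
sbt assembly

# Submit the one self-contained jar; no --jars needed, but as
# noted in the comments the jar is large and is shipped to
# every node on each run.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.Main \
  target/scala-2.11/myapp-assembly-0.1.jar
```

The trade-off is exactly the one raised above: simpler classpath handling versus copying a large jar to all nodes on every submission.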

0 Answers