I needed to add multiple jars to my spark-shell session, so I passed them with --jars. Two hours into a job, I realized I need one more jar. Is there any way to add a jar to the running session at this point, without restarting the shell and losing my work? I hope there is, because otherwise this could be a big problem.
Viewed 35 times
- What is your use case? Why do you need to add so many jars? – vaquar khan Jul 01 '18 at 18:16
- `sc.addJar("/path/to/your/jar")` https://stackoverflow.com/questions/42964131/sparkcontext-addjar-does-not-work-in-local-mode – gasparms Jul 01 '18 at 18:17
- gasparms' answer should work. You could also save your intermediate result and restart your shell. – Oli Jul 01 '18 at 19:43
- Yes, it's working! – arctic_Oak Jul 02 '18 at 08:34
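The suggestion above can be sketched as a short spark-shell session. `sc.addJar` is the `SparkContext` method that ships a jar to the executors of an already-running application; the jar path and class name here are hypothetical examples. Note the caveat from the linked question: `addJar` does not help in local mode, and code run on the driver also needs the jar on the driver classpath, which the shell's `:require` command handles.

```scala
// Inside an already-running spark-shell session.

// Distribute the extra jar to all executors without restarting
// the shell. (Hypothetical path, for illustration only.)
sc.addJar("/path/to/extra-library.jar")

// Also add it to the driver classpath so classes from the jar
// can be referenced directly in the shell:
// :require /path/to/extra-library.jar
```

Tasks launched after the `addJar` call can use classes from the jar on the executors; work already in progress and cached data in the session are unaffected, which is what makes this preferable to restarting the shell mid-job.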