I want to add the GeoSpark library to Apache Spark. How do I add the GeoSpark library from the Spark shell?
- http://spark.apache.org/docs/latest/programming-guide.html#using-the-shell – eliasah Dec 19 '15 at 10:32
- There is a good thread about the same topic here: https://stackoverflow.com/questions/37132559/add-jars-to-a-spark-job-spark-submit – Reza Jul 24 '17 at 12:40
1 Answer
$ ./bin/spark-shell --master local[4] --jars code.jar
The --jars option will automatically distribute your local custom JAR to the cluster.
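
Applied to GeoSpark specifically, you can either pass a downloaded GeoSpark JAR via --jars, or let Spark resolve the library from Maven with --packages. A minimal sketch, assuming GeoSpark's Maven coordinates are org.datasyslab:geospark and using 1.3.1 as an example version (verify the exact artifact name and version against the GeoSpark documentation):

# Option 1: point --jars at a locally downloaded GeoSpark JAR
$ ./bin/spark-shell --master local[4] --jars /path/to/geospark-1.3.1.jar

# Option 2: let Spark fetch the library from Maven Central (needs network access)
$ ./bin/spark-shell --master local[4] --packages org.datasyslab:geospark:1.3.1

Once the shell starts, the library's classes are on both the driver and executor classpaths, so an import such as import org.datasyslab.geospark.spatialRDD.PointRDD should resolve; if it fails, the JAR was not picked up.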

– Shawn Guo