
When I submit my app with spark-submit and spark-csv, I use the following command:

spark-submit --master spark://10.0.0.1:7077 --packages com.databricks:spark-csv_2.11:1.2.0  MyApp.jar

(note: I'm using --packages com.databricks:spark-csv_2.11:1.2.0)

The question is: how can I do the same with SparkLauncher? (I can't find anywhere in the API to pass in the package information.)

(Here is the code I'm using:)

import org.apache.spark.launcher.SparkLauncher

object Launcher extends App {

  // launch() returns a java.lang.Process handle for the child spark-submit
  val spark = new SparkLauncher()
    .setSparkHome("/myspark1.5.1path/")
    .setAppResource("/mypath/MyApp.jar")
    .setMainClass("MyApp")
    .setMaster("local[*]")
    .launch()

  // block until the launched application exits
  spark.waitFor()
}
Carson Pun
  • Apparently you can't (yet). https://spark.apache.org/docs/latest/api/java/org/apache/spark/launcher/SparkLauncher.html – Reactormonk Dec 15 '15 at 02:50
  • You can use the maven-shade-plugin to pre-package the dependency into your jar: http://stackoverflow.com/questions/32265456/how-to-pre-package-external-libraries-when-using-spark-on-a-mesos-cluster/32403191#32403191 – Kaushal Dec 15 '15 at 09:56

1 Answer

You can pass extra spark-submit flags through addSparkArg:

spark.addSparkArg("--packages", "com.databricks:spark-csv_2.11:1.2.0")
propi
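
For completeness, here is a minimal sketch of how that call fits into the launcher code from the question. Note that addSparkArg(name, value) appears to have been added only in Spark 1.6.0, so it is not available in the 1.5.1 install referenced in the question; the SPARK_HOME path below is a hypothetical 1.6+ install:

import org.apache.spark.launcher.SparkLauncher

object Launcher extends App {
  val spark = new SparkLauncher()
    .setSparkHome("/myspark1.6path/")   // hypothetical 1.6+ install; addSparkArg is not in 1.5.1
    .setAppResource("/mypath/MyApp.jar")
    .setMainClass("MyApp")
    .setMaster("local[*]")
    // the programmatic equivalent of spark-submit's --packages flag
    .addSparkArg("--packages", "com.databricks:spark-csv_2.11:1.2.0")
    .launch()

  // block until the launched application exits
  spark.waitFor()
}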