I'm totally new to this, so I don't really understand how it works. I need to run Spark on a machine I log into over SSH, with 60g of memory and 6 cores for execution. This is what I've tried:
spark-submit --master yarn --deploy-mode cluster --executor-memory 60g --executor-cores 6
And this is what I got:
SPARK_MAJOR_VERSION is set to 2, using Spark2
Exception in thread "main" java.lang.IllegalArgumentException: Missing application resource.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:253)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildSparkSubmitArgs(SparkSubmitCommandBuilder.java:160)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildSparkSubmitCommand(SparkSubmitCommandBuilder.java:276)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:151)
at org.apache.spark.launcher.Main.main(Main.java:87)
So I guess there is something I need to add to this command to make it run, but I have no idea what.
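From the "Missing application resource" message, my guess is that spark-submit also wants the application itself (a JAR or a Python script) at the end of the command. Something like the line below, where the --class value and the JAR path are just placeholders I made up, not real files on my machine:

spark-submit --master yarn --deploy-mode cluster --executor-memory 60g --executor-cores 6 --class com.example.MyApp /path/to/my-app.jar

Is that the right idea, and if so, what exactly am I supposed to put there if I just want to get a Spark session running with those resources?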