I have submitted my Spark job as follows:

bin/spark-submit --class DataSet BasicSparkJob-assembly-1.0.jar

without specifying the --master parameter or the spark.master property. Despite that, the job gets submitted to my 3-node Spark cluster. But I was wondering where it was actually submitted, because it is not showing any information under Running Applications.

Naresh
- If you have set `conf.setMaster("local[X]")` inside your application, it will always run locally, even if you submit it with `--master URL`. – Gakuo Jan 27 '19 at 07:51
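To illustrate that comment, here is a minimal Scala sketch of an application with a hard-coded master; the body of the DataSet class is assumed, and only the setMaster call matters:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object DataSet {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("DataSet")
      // A master hard-coded here wins over spark-submit's --master flag,
      // so the job runs locally no matter where you submit it from.
      .setMaster("local[2]")
    val sc = new SparkContext(conf)
    // ... job logic elided ...
    sc.stop()
  }
}
```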
2 Answers
If you set the master neither via --master nor via spark.master, Spark will run in local mode.
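For comparison, a sketch of pointing the job at the standalone cluster explicitly; the master hostname below is a placeholder, and 7077 is the standalone master's default port:

```
bin/spark-submit \
  --class DataSet \
  --master spark://<master-host>:7077 \
  BasicSparkJob-assembly-1.0.jar
```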
You can still view the progress of your job: by default, the Spark UI is available at http://localhost:4040 while the job is running.

When your job finishes, this UI is torn down, and you cannot view the history of your application unless you have configured the Spark history server.
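As a minimal sketch of that configuration (the log directory below is a placeholder; any path readable by both the job and the history server works), enable event logging and point the history server at the same directory in spark-defaults.conf:

```
# spark-defaults.conf -- log directory is a placeholder
spark.eventLog.enabled           true
spark.eventLog.dir               file:///tmp/spark-events
spark.history.fs.logDirectory    file:///tmp/spark-events
```

Then start it with sbin/start-history-server.sh; by default the history UI is served at http://localhost:18080.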

user1314742
- Does it use the same configuration mentioned in spark-defaults.conf for the memory and cores used for processing? – Naresh Jul 12 '16 at 10:57
- If you mean the Spark history server, it uses `spark-defaults.conf` as its configuration file, but the history server itself is not a Spark application, meaning it is not started like a Spark job with a driver and executors. – user1314742 Jul 12 '16 at 11:37
It's likely that Spark is running your job in local mode on your development machine.

Kien Truong
- OK, is there a way I can see the logs of this running application then? – Naresh Jul 12 '16 at 08:43