The documentation on spark-submit says the following:
The spark-submit script in Spark’s bin directory is used to launch applications on a cluster.
Regarding pyspark, it says the following:
You can also use bin/pyspark to launch an interactive Python shell.
This question may sound stupid, but when I run commands through pyspark, do they also run on the "cluster"? They do not run only on the master node, right?
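To make the question concrete, here is roughly what I am doing (a minimal sketch; the master URL is just a placeholder for whatever my cluster manager exposes):

    # Launched with: ./bin/pyspark --master spark://<master-host>:7077
    # Inside the shell, `spark` (SparkSession) and `sc` (SparkContext) are predefined.

    print(sc.master)               # which cluster manager the shell is connected to
    print(sc.defaultParallelism)   # rough hint at how many executor cores are available

    # My understanding is that a transformation like this would be executed by the
    # executors on the cluster, not by the shell process itself -- is that correct?
    rdd = sc.parallelize(range(1000))
    print(rdd.map(lambda x: x * x).sum())

In other words, is the pyspark shell just the driver, with the actual work distributed to the executors, the same way it would be for an application submitted with spark-submit?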