
I have a 3-node Spark standalone cluster, and on the master node I also run a worker. When I submit an app to the cluster, the two other workers start RUNNING, but the worker on the master node stays with status LOADING, and eventually another worker is launched on one of the other machines.

Is having a worker and a master on the same node the problem? If so, is there a way to work around it, or should I never have a worker and a master on the same node?

P.S. The machines have 8 cores each, and the workers are set to use 7 cores and not all of the RAM.
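
For reference, a minimal sketch of how those limits are typically set in conf/spark-env.sh on each worker machine; the variable names are standard Spark settings, but the memory value here is an assumption to match the setup described:

    # conf/spark-env.sh on each worker machine
    export SPARK_WORKER_CORES=7      # use 7 of the 8 cores, leaving one free
    export SPARK_WORKER_MEMORY=12g   # assumption: some amount below total RAM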

vntzy

2 Answers


Yes, you can; here is the relevant passage from the Spark documentation:

In addition to running on the Mesos or YARN cluster managers, Spark also provides a simple standalone deploy mode. You can launch a standalone cluster either manually, by starting a master and workers by hand, or use our provided launch scripts. It is also possible to run these daemons on a single machine for testing.
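
As a minimal sketch of the manual option from that quote, you can start both daemons by hand on the same machine. The script names below are from a standard Spark distribution (start-slave.sh in older releases, start-worker.sh in newer ones); the master hostname is a placeholder you would fill in:

    # on the master machine
    $SPARK_HOME/sbin/start-master.sh

    # start a worker on the same machine, pointing it at the local master
    $SPARK_HOME/sbin/start-slave.sh spark://<master-hostname>:7077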

Kehe CAI

It is possible to have a machine hosting both Workers and a Master.

Is it possible that you misconfigured the spark-env.sh on that specific machine?
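
For example, here is a sketch of the settings worth double-checking in conf/spark-env.sh on a machine hosting both a master and a worker. The variable names are standard (SPARK_MASTER_HOST replaces SPARK_MASTER_IP in newer Spark releases); the hostname and IP values are assumptions for illustration:

    # conf/spark-env.sh on the machine running both master and worker
    export SPARK_MASTER_HOST=master-node   # hypothetical hostname; must resolve from all workers
    export SPARK_LOCAL_IP=192.168.1.10     # assumption: this node's cluster-facing IP
    export SPARK_WORKER_CORES=7
    export SPARK_WORKER_MEMORY=12g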

imriqwe