
I was reading that it should be possible to run multiple executors per worker instance in Spark standalone mode:

https://github.com/apache/spark/pull/731

But it looks like it has not been merged yet. Can anyone confirm? In the meantime, to speed up processing, can I still get more executors by setting:

SPARK_WORKER_INSTANCES

Is that all I need to do?
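For context, here is a sketch of what I understand the relevant settings in `conf/spark-env.sh` to be (the values below are just illustrative, not a recommendation):

```shell
# conf/spark-env.sh on each worker node (values are illustrative)

# Number of worker instances to launch per node.
# Each instance is a separate worker JVM that can host an executor.
SPARK_WORKER_INSTANCES=2

# If you raise SPARK_WORKER_INSTANCES, you presumably also want to cap
# cores and memory per worker so the instances don't oversubscribe the node.
SPARK_WORKER_CORES=4
SPARK_WORKER_MEMORY=8g
```

My understanding is that after changing this you restart the workers (e.g. via `sbin/stop-all.sh` / `sbin/start-all.sh`), but I'd like confirmation that this is the whole story.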

Enrico D' Urso
