I am trying to run a Spark application on AWS EMR. I see that the driver always shares a host with the executors. Is there a way to avoid that, so that the driver gets a host to itself?
I was able to achieve this, in a way, by setting the driver memory to a very large value. But since the instance type is not fixed, I have to keep adjusting that setting for each host size. Is there a way to tell Spark to use all available memory without specifying a concrete value? I tried setting spark.driver.memory to 0, but Spark rejects 0. Thanks.
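For context, the workaround I'm using today is a small wrapper script (sketched below, with hypothetical names) that reads the host's physical memory from /proc/meminfo at submit time and passes roughly 90% of it as --driver-memory, so the value tracks whatever instance type the driver lands on:

```shell
#!/bin/sh
# Read total physical memory in MiB from /proc/meminfo.
TOTAL_MB=$(awk '/MemTotal/ {print int($2/1024)}' /proc/meminfo)
# Reserve ~10% for the OS and other daemons; give the rest to the driver.
DRIVER_MB=$((TOTAL_MB * 90 / 100))
# my_app.py is a placeholder; substitute your actual application and options.
echo "spark-submit --driver-memory ${DRIVER_MB}m my_app.py"
```

This avoids hard-coding a number per instance type, but it still doesn't stop YARN from scheduling executors onto the same node, which is what I'm really trying to prevent.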