
According to this answer, Python worker processes are spun up on the nodes where the executors run when functions such as `foreachPartition` or `mapPartitions` are used. How are the memory and compute capacities of these Python processes set? Can we control them through some configuration?

justlikethat
  • Like all other [configuration properties](https://spark.apache.org/docs/latest/configuration.html#application-properties)? – mazaneicha Feb 13 '23 at 18:07
  • @mazaneicha The closest property on that list is `spark.executor.pyspark.memory`, but even its description does not explicitly say whether it applies to the Python processes created by the executor. Also, is there a similar parameter for cores? – justlikethat Feb 13 '23 at 18:39
  • Python will be able to use all `spark.executor.cores` cores. – mazaneicha Feb 13 '23 at 18:49
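
Putting the two comments above together, here is a minimal sketch of how these properties might be set when building a session. The property names come from the comments and the linked configuration docs; all values and the app name are illustrative, not recommendations:

```python
from pyspark.sql import SparkSession

# Illustrative values only.
# spark.executor.pyspark.memory caps the memory available to the Python
# worker processes on each executor; spark.executor.cores bounds the
# task parallelism per executor, and hence how many concurrent Python
# workers each executor will launch.
spark = (
    SparkSession.builder
    .appName("pyspark-worker-sizing")  # hypothetical app name
    .config("spark.executor.memory", "4g")
    .config("spark.executor.pyspark.memory", "1g")
    .config("spark.executor.cores", "4")
    .getOrCreate()
)
```

The same properties can equally be passed on the command line, e.g. `spark-submit --conf spark.executor.pyspark.memory=1g ...`.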

0 Answers