
For some reason, I want to create as many partitions as (number of executors) × (number of tasks per executor), and pass that value as the second parameter of sc.parallelize. My question is: can I programmatically get the number of executors and the number of tasks per executor in Spark?
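One possible sketch, assuming a statically configured cluster: the config keys `spark.executor.instances`, `spark.executor.cores`, and `spark.task.cpus` are real Spark settings, and concurrent tasks per executor is `executor.cores / task.cpus`. The dict below is a stand-in for a live `SparkConf` (in a real job you would call `conf.get(...)` on `sc.getConf`), and the chosen values are illustrative only. Note this approach does not cover dynamic allocation, where the executor count changes at runtime.

```python
# Stand-in for SparkConf; the keys are real Spark config names,
# but these values are hypothetical examples.
conf = {
    "spark.executor.instances": "4",  # number of executors
    "spark.executor.cores": "3",      # cores per executor
    "spark.task.cpus": "1",           # cores each task occupies
}

def desired_partitions(conf):
    """Partitions = executors * concurrent tasks per executor."""
    executors = int(conf.get("spark.executor.instances", "2"))
    cores = int(conf.get("spark.executor.cores", "1"))
    task_cpus = int(conf.get("spark.task.cpus", "1"))
    tasks_per_executor = cores // task_cpus
    return executors * tasks_per_executor

n = desired_partitions(conf)
print(n)  # 4 executors * 3 tasks each = 12
# Would then be used as: sc.parallelize(data, n)
```

With the defaults above, `sc.defaultParallelism` often already equals total cores across executors, so it may be worth checking that value before computing your own.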

Cœur
MetallicPriest

0 Answers