
I am currently working on Spark and trying to build an adaptive execution plan. I am wondering whether it is possible to modify the parameters of the Spark engine at runtime. For example, can I use different compression codecs for two separate stages, or can I modify the memory fractions reserved for shuffling and computation at runtime? Say, during the map phase, could I reduce the memory fraction allocated for shuffling, and then increase it later when the shuffling actually occurs?

Thanks

YACINE GACI

1 Answer


It is not possible in general.

While a subset of configuration options can be changed at runtime through the RuntimeConfig object (see Customize SparkContext using sparkConf.set(..) when using spark-shell), core options cannot be modified unless the SparkContext is restarted.
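To illustrate the distinction, here is a rough spark-shell sketch. It assumes Spark 2.x or later (where `spark.conf` exposes RuntimeConfig); `spark.sql.shuffle.partitions` is just one example of a runtime-modifiable SQL option, and the codec/memory settings shown are examples of core options fixed at context startup:

```scala
// Runtime SQL options can be changed between jobs via RuntimeConfig:
spark.conf.set("spark.sql.shuffle.partitions", "50")  // affects subsequent jobs

// Core engine options (executor memory, shuffle compression codec, memory
// fractions, ...) are read when the SparkContext starts; setting them here
// either throws an error or has no effect on the running context:
// spark.conf.set("spark.executor.memory", "4g")      // does NOT resize executors

// Changing a core option requires stopping and recreating the context:
spark.stop()
val spark2 = org.apache.spark.sql.SparkSession.builder()
  .config("spark.io.compression.codec", "lz4")  // applied only at startup
  .getOrCreate()
```

In other words, per-stage tuning of things like compression codecs or shuffle memory is not supported: those values are bound to the SparkContext for its whole lifetime.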