I need to externalize the Spark configs in our job.conf files so that they can be read from a single external location at runtime and modified only in that one place.
Configs such as the following would be stored in this file:

spark.executor.memory
spark.executor.cores
spark.executor.instances
spark.sql.adaptive.enabled
spark.sql.legacy.timeParserPolicy
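To illustrate, the external file I have in mind would be a simple key=value properties file, something like this (the path and values below are just placeholders):

```properties
# external Spark config file, e.g. /shared/conf/spark-job.properties (path is hypothetical)
spark.executor.memory=4g
spark.executor.cores=4
spark.executor.instances=10
spark.sql.adaptive.enabled=true
spark.sql.legacy.timeParserPolicy=LEGACY
```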
I am very new to this and am finding very limited resources on the web about handling this process. I've seen a couple of YouTube videos about using a Scala file to handle it. Any assistance would be greatly appreciated.
I have attempted to emulate the Scala examples I have seen online, but I don't know how to call the resulting file from Spark (or even whether the Scala is correct to begin with).
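For reference, this is the general shape of what I'm aiming for: a small Scala helper that loads key=value pairs from an external properties file, which could then be applied when building the SparkSession. This is only a sketch of the idea, not working job code, and the file path and object name are placeholders I made up:

```scala
import java.io.FileInputStream
import java.util.Properties
import scala.jdk.CollectionConverters._

object SparkConfLoader {
  // Read key=value pairs from an external properties file into a Map,
  // so the same file can drive every job's Spark settings.
  def load(path: String): Map[String, String] = {
    val props = new Properties()
    val in = new FileInputStream(path)
    try props.load(in) finally in.close()
    props.asScala.toMap
  }
}

// Intended usage (Spark itself omitted here; builder names are from the
// SparkSession API, but I haven't verified this end to end):
//
// val builder = SparkSession.builder().appName("my-job")
// val configured = SparkConfLoader
//   .load("/shared/conf/spark-job.properties")            // hypothetical path
//   .foldLeft(builder) { case (b, (k, v)) => b.config(k, v) }
// val spark = configured.getOrCreate()
```

I'm not sure whether something like this is the right approach, or whether I should be pointing spark-submit at the file instead (I've seen mention of a --properties-file option).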