I have PySpark code running in an AWS Glue job. The job takes an argument called 'update_mode', and I want to apply a different Spark configuration depending on whether update_mode is 'full_overwrite' or 'upsert'. Specifically, I want to switch the Spark config spark.sql.sources.partitionOverwriteMode between 'static' and 'dynamic'.
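
For context, the argument reaches the script roughly like this (simplified; the job is invoked with --update_mode):

```python
import sys
from awsglue.utils import getResolvedOptions

# Glue resolves --update_mode (plus the standard --JOB_NAME) into a dict.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "update_mode"])
update_mode = args["update_mode"]  # "full_overwrite" or "upsert"
```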
I tried creating two Spark sessions and using whichever spark object matches the mode, but it doesn't behave as expected. The only other option I can think of is creating two separate jobs with different configurations.
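
This is approximately what the two-session attempt looked like (simplified; variable names are mine, and I'm not sure the second getOrCreate() actually honors its config, which may be the problem):

```python
from pyspark.sql import SparkSession

# One session per overwrite mode, picking whichever matches update_mode.
spark_static = (
    SparkSession.builder
    .config("spark.sql.sources.partitionOverwriteMode", "static")
    .getOrCreate()
)
spark_dynamic = (
    SparkSession.builder
    .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
    .getOrCreate()
)

spark = spark_dynamic if update_mode == "upsert" else spark_static
# Problem: writes behave as if the overwrite mode never changes between runs.
```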
Any other ideas for doing this within the same job?