I am trying to override the Spark session/Spark context default configs, but the job still picks up the entire node/cluster resources.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("ip")  # placeholder for the real master URL
         .enableHiveSupport()
         .getOrCreate())

spark.conf.set("spark.executor.memory", "8g")
spark.conf.set("spark.executor.cores", "3")
spark.conf.set("spark.cores.max", "3")
spark.conf.set("spark.driver.memory", "8g")

sc = spark.sparkContext
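For comparison, here is a minimal sketch of what I understand the builder-time alternative to be, passing the same values through .config() before getOrCreate() ("ip" remains the placeholder master URL):

from pyspark.sql import SparkSession

# Sketch: the same four settings applied at build time, before the
# session (and its SparkContext) is created.
spark = (SparkSession.builder
         .master("ip")
         .enableHiveSupport()
         .config("spark.executor.memory", "8g")
         .config("spark.executor.cores", "3")
         .config("spark.cores.max", "3")
         .config("spark.driver.memory", "8g")
         .getOrCreate())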
It works fine when I pass the configuration to spark-submit instead:
spark-submit --master ip --executor-cores 3 --driver-memory 10G code.py
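Presumably the full equivalent of my in-code settings would be something like this (a sketch using the standard spark-submit flags; "ip" is still the placeholder master URL):

spark-submit \
  --master ip \
  --executor-memory 8G \
  --executor-cores 3 \
  --driver-memory 8G \
  --conf spark.cores.max=3 \
  code.py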