I'm trying to enable dynamic partitioning in my local Spark session (the pyspark shell, not application/submit mode).
I'm running the commands below in my PySpark shell (Spark 2.4):
spark.sqlContext.setConf("hive.exec.dynamic.partition", "true")
spark.sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
and I get this error:

AttributeError: 'SparkSession' object has no attribute 'sqlContext'
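For reference, my understanding is that in PySpark the SparkSession does not expose a public sqlContext attribute (unlike the Scala API), and that spark.conf.set is the session-native way to set these properties. A minimal sketch of what I would expect to work, assuming a Hive-enabled session (the session builder is only needed outside the shell, where `spark` is already defined):

```python
from pyspark.sql import SparkSession

# Build (or reuse) a Hive-enabled session; inside the pyspark shell
# the `spark` variable already exists, so this step can be skipped there.
spark = (
    SparkSession.builder
    .appName("dynamic-partition-demo")
    .enableHiveSupport()
    .getOrCreate()
)

# Set the Hive dynamic-partitioning properties directly on the session,
# avoiding the (Scala-only) sqlContext attribute entirely.
spark.conf.set("hive.exec.dynamic.partition", "true")
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")

# Read the values back to confirm they took effect.
print(spark.conf.get("hive.exec.dynamic.partition"))
print(spark.conf.get("hive.exec.dynamic.partition.mode"))
```

An alternative I have seen suggested is running the equivalent SQL statements, e.g. spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict"); is either of these the correct approach on 2.4?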