
I am trying to enable dynamic partitioning in my local Spark session (not in application mode).

I'm running the commands below in my PySpark shell (Spark 2.4):

spark.sqlContext.setConf("hive.exec.dynamic.partition", "true")
spark.sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

I'm getting the error below:

AttributeError: 'SparkSession' object has no attribute 'sqlContext'

Seems to be a duplicate of https://stackoverflow.com/questions/58633753/ignoring-non-spark-config-property-hive-exec-dynamic-partition-mode? – Som May 28 '20 at 10:01

1 Answer


Try constructing an SQLContext from the shell's existing SparkContext instead:

from pyspark.sql import SQLContext

# Wrap the SparkContext that the pyspark shell already created
sqlContext = SQLContext(spark.sparkContext)
sqlContext.setConf("hive.exec.dynamic.partition", "true")
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
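
Alternatively, in Spark 2.x you can set these properties through the SparkSession's runtime configuration directly, without constructing an SQLContext at all. A minimal sketch, assuming a session built with Hive support enabled (the app name here is hypothetical):

from pyspark.sql import SparkSession

# Build (or reuse) a session with Hive support so the Hive
# dynamic-partition properties actually take effect
spark = (SparkSession.builder
         .appName("dynamic-partition-demo")  # hypothetical app name
         .enableHiveSupport()
         .getOrCreate())

# spark.conf is the session-level runtime configuration
spark.conf.set("hive.exec.dynamic.partition", "true")
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")

In the pyspark shell the spark object already exists, so the two spark.conf.set calls on their own should be enough.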