How do I store a PySpark DataFrame to a Hive table ("primary12345" is a Hive table)?
I am using the code below, where masterDataDf is a DataFrame object:

masterDataDf.write.saveAsTable("default.primary12345")
I am getting the error below:
: java.lang.RuntimeException: Tables created with SQLContext must be TEMPORARY. Use a HiveContext instead.
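The error message itself points at the cause: in Spark 1.x, a plain SQLContext can only create temporary tables, so persisting to Hive requires a HiveContext. A minimal sketch of the fix, assuming Spark 1.x built with Hive support (the app name and the sample rows are placeholders for illustration):

```python
from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="hive-write-example")  # placeholder app name
sqlContext = HiveContext(sc)  # HiveContext instead of plain SQLContext

# The DataFrame must come from this HiveContext; sample data for illustration.
masterDataDf = sqlContext.createDataFrame(
    [(1, "alice"), (2, "bob")], ["id", "name"]
)

# With a HiveContext, saveAsTable writes a persistent Hive table.
masterDataDf.write.saveAsTable("default.primary12345")
```

In Spark 2.x the same is done through a SparkSession created with `SparkSession.builder.enableHiveSupport().getOrCreate()`, which replaces both SQLContext and HiveContext.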