I have a list of columns and need to apply withColumn for each one, similar to how it is done in Scala with foldLeft:
list.foldLeft(df){(tempDF, listValue) => tempDF.withColumn(listValue._1, listValue._2) }
Let me clarify the question. I have a DataFrame with array-typed columns and need to flatten each of them using explode, e.g. df.withColumn("col1", explode("col1")). How can I write a generic function that applies this to all the array columns in one go? When I use a for loop, the row count multiplies with every explode statement.
How can this be done in PySpark?