I have a Java Map variable, say Map<String, String> singleColMap. I want to add this Map to a Dataset as the value of a new column in Spark 2.2 (Java 1.8).
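For context, here is roughly the setup I am working with; the class name, sample data, and column name are just placeholders I made up for this question:

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class AddMapColumn {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("AddMapColumn")
                .master("local[*]")
                .getOrCreate();

        // Small placeholder dataset; the real one has more columns and rows.
        Dataset<Row> ds = spark
                .createDataset(Arrays.asList("a", "b", "c"), Encoders.STRING())
                .toDF("value");

        // The Java Map I want to attach to every row as a new column.
        Map<String, String> singleColMap = new HashMap<>();
        singleColMap.put("key1", "val1");
        singleColMap.put("key2", "val2");
    }
}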
I tried the code below, but it does not work:
ds.withColumn("cMap", lit(singleColMap).cast(MapType(StringType, StringType)))
Can someone help with this?