I have a DataFrame with a MapType field.
>>> from pyspark.sql.functions import *
>>> from pyspark.sql.types import *
>>> fields = StructType([
... StructField('timestamp', TimestampType(), True),
... StructField('other_field', StringType(), True),
... StructField('payload', MapType(
... keyType=StringType(),
... valueType=StringType()),
... True), ])
>>> import datetime
>>> rdd = sc.parallelize([[datetime.datetime.now(), 'this should be in', {'akey': 'aValue'}]])
>>> df = rdd.toDF(fields)
>>> df.show()
+--------------------+-----------------+-------------------+
| timestamp| other_field| payload|
+--------------------+-----------------+-------------------+
|2018-01-10 12:56:...|this should be in|Map(akey -> aValue)|
+--------------------+-----------------+-------------------+
I would like to add the other_field column into the payload map, using the column name as the key. I know I can use a UDF:
>>> def _add_to_map(name, value, map_field):
... map_field[name] = value
... return map_field
...
>>> add_to_map = udf(_add_to_map, MapType(StringType(), StringType()))
>>> df.select(add_to_map(lit('other_field'), 'other_field', 'payload')).show(1, False)
+------------------------------------------------------+
|PythonUDF#_add_to_map(other_field,other_field,payload)|
+------------------------------------------------------+
|Map(other_field -> this should be in, akey -> aValue) |
+------------------------------------------------------+
Is there a way to do this without a UDF?
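The only built-in route I can see would be map_concat combined with create_map, but as far as I know map_concat only exists in Spark 2.4+, so this is just a sketch under that assumption rather than something I can run on my version:

>>> from pyspark.sql.functions import col, create_map, lit, map_concat
>>> # merge a one-entry map built from other_field into payload (needs Spark 2.4+)
>>> df.withColumn('payload',
...     map_concat(col('payload'),
...                create_map(lit('other_field'), col('other_field')))
... ).select('payload').show(1, False)

If there is something that works on earlier versions without dropping into a Python UDF, that would be ideal.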