Using rdd.map(lambda x: ...) in PySpark, I need to write a lambda function that formats a string.
For example, each row of a column contains a string such as "abcdefgh", and after every two characters I want to insert "-" so that I get "ab-cd-ef-gh".
How could I implement this with the correct PySpark syntax? My non-working attempt looks like this:
df.rdd.map(lambda x: ((for i in range(10): x[i+2:2] + "-"),)).toDF()
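A minimal sketch of one way this could be done, assuming the DataFrame has a single string column (the column name value and the sample data below are assumptions): slice each string in steps of two characters and join the pieces with "-".

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical sample data: one string column named "value"
df = spark.createDataFrame([("abcdefgh",), ("ijklmnop",)], ["value"])

# Slice each string in steps of two characters and join the pieces with "-"
result = df.rdd.map(
    lambda row: ("-".join(row.value[i:i + 2] for i in range(0, len(row.value), 2)),)
).toDF(["value"])

result.show()  # rows become "ab-cd-ef-gh", "ij-kl-mn-op"

The trailing comma inside the lambda wraps each result in a one-element tuple, so every mapped record is a one-column row that toDF() can turn back into a DataFrame.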