I have a dataframe

import os, sys
import json, time, random, string, requests
import pyodbc
from pyspark import SparkConf, SparkContext, SQLContext
from pyspark.sql.functions import explode, col, from_json, lit
from pyspark.sql import functions as f
from pyspark.sql import SparkSession
from pyspark.sql.types import *
...
# Parse the JSON string in data.dev_property into a map<string, string> column
df = data.withColumn("dev_serial", col("data.dev_serial")) \
    .withColumn("dev_property", from_json(col("data.dev_property"), MapType(StringType(), StringType()))) \
    .drop("data")
df.show(truncate=False)
df.printSchema()

This is the result:

[screenshot of the df.show() output: dev_serial and dev_property columns]

I want to explode the dev_property column into separate columns, like this:

dev_serial / use_event / item / ...
value1 / value2 / value3 / value4
...

How can I explode it like this?

  • Does this answer your question? [How to show full column content in a Spark Dataframe?](https://stackoverflow.com/questions/33742895/how-to-show-full-column-content-in-a-spark-dataframe) – Cribber Oct 06 '20 at 13:07

1 Answer

Since you want to explode the dev_property map column, this script should help:

# explode a MapType column into one row per (key, value) pair
df2 = df.select(df.dev_serial, explode(df.dev_property))
df2.printSchema()
df2.show()
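
On a MapType column, explode emits one row per map entry, as key and value columns. Using the placeholder names from the question, the output would look roughly like this:

+----------+---------+------+
|dev_serial|key      |value |
+----------+---------+------+
|1401      |use_event|value1|
|1401      |item     |value2|
+----------+---------+------+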

Read more about how explode works on Array and Map types.
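
If you instead want each map key promoted to its own column (dev_serial / use_event / item / ...), here is a minimal sketch, assuming the key names are known up front (the names below are taken from the question's desired output):

keys = ["use_event", "item"]  # assumed key names; replace with your map's actual keys
df3 = df.select(
    df.dev_serial,
    *[df.dev_property.getItem(k).alias(k) for k in keys]
)
df3.show()

If the keys are not known in advance, you can collect them first with something like keys = [r.key for r in df2.select("key").distinct().collect()], at the cost of an extra Spark job.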

    While code-only answers might answer the question, you could significantly improve the quality of your answer by providing context for your code, a reason for why this code works, and some references to documentation for further reading. From [answer]: _"Brevity is acceptable, but fuller explanations are better."_ – Pranav Hosangadi Oct 06 '20 at 19:26
  • Thanks, but this result isn't quite what I need. I want columns dev_serial / key1 / key2 / key3 ... with values 1401 / value1 / value2 / value3, but your solution's result has columns dev_serial / key / value with rows like 1401 / key1 / value1 ... Do you have another solution? – KIMJAEMIN Oct 06 '20 at 21:53
  • Thanks to the site you pointed me to, I solved the problem. Thank you!!! – KIMJAEMIN Oct 06 '20 at 22:56
  • @KIMJAEMIN, good to know. You can add the solution that worked for you. – amsh Oct 07 '20 at 02:16