Here is my JSON file (multi2.json):
[
{"string":"string1","int":1,"array":[1,2,3],"dict": {"key": "value1"}},
{"string":"string2","int":2,"array":[2,4,6],"dict": {"key": "value2"}}
]
Here is my parse code:
val mdf = sparkSession.read.option("multiline", "true").json("multi2.json")
mdf.show(false)
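For context, I also inspected the inferred schema in the same session (the output below is reproduced from memory, so treat it as approximate):

```scala
mdf.printSchema()
// Roughly:
// root
//  |-- _corrupt_record: string (nullable = true)
//  |-- array: array (nullable = true)
//  |    |-- element: long (containsNull = true)
//  |-- dict: struct (nullable = true)
//  |    |-- key: string (nullable = true)
//  |-- int: long (nullable = true)
//  |-- string: string (nullable = true)
```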
This outputs:
+---------------+---------+--------+----+-------+
|_corrupt_record|array |dict |int |string |
+---------------+---------+--------+----+-------+
|[ |null |null |null|null |
|null |[1, 2, 3]|[value1]|1 |string1|
|null |[2, 4, 6]|[value2]|2 |string2|
|] |null |null |null|null |
+---------------+---------+--------+----+-------+
Why do I get a _corrupt_record column when the JSON looks valid? And why does the dict column show only the values and not the keys?
Thanks