I'm trying to transform a Spark DataFrame into a Scala Map, and additionally a list of values.
It is best illustrated as follows:
val df = sqlContext.read.json("examples/src/main/resources/people.json")
df.show()
+----+-------+
| age| name|
+----+-------+
|null|Michael|
| 30| Andy|
| 19| Justin|
| 21|Michael|
+----+-------+
I want to turn it into a Scala collection (a Map from an index to a row Map, where rows sharing the same name are collapsed into a List), represented like this:
Map(
(0 -> List(Map("age" -> null, "name" -> "Michael"), Map("age" -> 21, "name" -> "Michael"))),
(1 -> Map("age" -> 30, "name" -> "Andy")),
(2 -> Map("age" -> 19, "name" -> "Justin"))
)
As I don't know much about Scala, I wonder whether this is possible. It doesn't have to be a List specifically.
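One possible sketch, assuming the DataFrame has been collected to the driver: each Spark `Row` can be converted to a `Map[String, Any]` via `Row.getValuesMap`, and grouping those maps by `"name"` and zipping with an index yields the desired structure. Below, `rows` stands in for the collected data (a hand-written assumption, since this snippet does not start a live SparkSession); with Spark the equivalent would be something like `df.collect().map(r => r.getValuesMap[Any](r.schema.fieldNames)).toSeq`.

```scala
// Stand-in for df.collect() converted to row maps (assumed data,
// matching the people.json example above).
val rows: Seq[Map[String, Any]] = Seq(
  Map("age" -> null, "name" -> "Michael"),
  Map("age" -> 30, "name" -> "Andy"),
  Map("age" -> 19, "name" -> "Justin"),
  Map("age" -> 21, "name" -> "Michael")
)

// Group rows by their "name" field, then assign each group an index.
// Groups with one row still come out as a one-element List; unwrapping
// singletons to a bare Map (as in the question) would need a further
// match on group size.
val grouped: Map[Int, List[Map[String, Any]]] =
  rows.groupBy(_("name"))      // Map[Any, Seq[Map[String, Any]]]
    .values.toList             // drop the name keys, keep the groups
    .zipWithIndex              // pair each group with 0, 1, 2, ...
    .map { case (group, idx) => idx -> group.toList }
    .toMap
```

Note that `groupBy` does not guarantee which group receives which index, so the 0/1/2 assignment may differ from the example; if stable ordering matters, sort the groups before zipping.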