
I have a column in a Spark dataframe where the schema looks something like this:

|-- seg: map (nullable = false)
 |    |-- key: string
 |    |-- value: array (valueContainsNull = false)
 |    |    |-- element: struct (containsNull = false)
 |    |    |    |-- id: integer (nullable = false)
 |    |    |    |-- expiry: long (nullable = false)

The value in the column looks something like this:

Map(10000124 -> WrappedArray([20185255,1561507200], [20185256,1561507200]))

What I want to do is create a column from this Map column which only contains an array of [20185255, 20185256] (the elements of the array are the 1st element of each array in the WrappedArray). How do I do this?

I am trying not to use "explode".
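For context, a minimal explode-free sketch, assuming Spark 2.4+ (where the `map_values`, `flatten`, and `transform` higher-order functions are available) and a DataFrame named `df`:

```scala
import org.apache.spark.sql.functions.expr

// map_values(seg) -> array of arrays of structs; flatten -> array of structs;
// transform(..., s -> s.id) -> keep only the id field of each struct.
val withIds = df.withColumn(
  "ids",
  expr("transform(flatten(map_values(seg)), s -> s.id)")
)
```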

**Also, is there a way I can use a UDF which takes in the Map and gets those values?**
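A sketch of the UDF route, assuming Spark 2.x, where a map-of-array-of-struct column arrives in a Scala UDF as `Map[String, Seq[Row]]` (the names `extractIds` and `df` are mine):

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.{col, udf}

// Flatten all WrappedArrays in the map's values and keep only each struct's id field.
val extractIds = udf { seg: Map[String, Seq[Row]] =>
  seg.values.flatten.map(_.getAs[Int]("id")).toSeq
}

val withIds = df.withColumn("ids", extractIds(col("seg")))
```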

Possible duplicate of [Querying Spark SQL DataFrame with complex types](http://stackoverflow.com/questions/28332494/querying-spark-sql-dataframe-with-complex-types) – zero323 Jul 06 '16 at 15:54
