
I have a Spark DataFrame that looks like this:

id   DataArray
a    array(3,2,1)
b    array(4,2,1)     
c    array(8,6,1)
d    array(8,2,4)

I want to transform this DataFrame into:

id  col1  col2  col3
a    3     2     1
b    4     2     1
c    8     6     1 
d    8     2     4

What function should I use?

lserlohn

2 Answers


Use apply, i.e. index the array column by position inside select:

import org.apache.spark.sql.functions.col

df.select(
  // prepend the id column, then one column per array element (col1..col3)
  col("id") +: (0 until 3).map(i => col("DataArray")(i).alias(s"col${i + 1}")): _*
)
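
For reference, a minimal runnable sketch of this approach on the question's sample data (assuming a SparkSession in scope as spark; the variable names are illustrative):

import org.apache.spark.sql.functions.col
import spark.implicits._

// build the example DataFrame from the question
val df = Seq(
  ("a", Array(3, 2, 1)),
  ("b", Array(4, 2, 1)),
  ("c", Array(8, 6, 1)),
  ("d", Array(8, 2, 4))
).toDF("id", "DataArray")

df.select(
  col("id") +: (0 until 3).map(i => col("DataArray")(i).alias(s"col${i + 1}")): _*
).show()
// +---+----+----+----+
// | id|col1|col2|col3|
// +---+----+----+----+
// |  a|   3|   2|   1|
// |  b|   4|   2|   1|
// |  c|   8|   6|   1|
// |  d|   8|   2|   4|
// +---+----+----+----+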
Vincent Doba
user9554572
  • I am not able to resolve import org.apache.spark.sql.col; may I know which version of Spark you are using? Do we need any additional packages? import org.apache.spark.sql.col gives :23: error: object col is not a member of package org.apache.spark.sql – nkkrishnak Jun 23 '19 at 06:59
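
For anyone hitting that error: col is not a member of the org.apache.spark.sql package itself; it is defined in the functions object, so the import has to be the one used in the answer above:

// fails: col is not defined directly in org.apache.spark.sql
// import org.apache.spark.sql.col

// works: col is a member of org.apache.spark.sql.functions
import org.apache.spark.sql.functions.col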

You can use foldLeft to add a column for each element of DataArray.

Make a list of the column names that you want to add:

import org.apache.spark.sql.functions.col

val columns = List("col1", "col2", "col3")

val result = columns.zipWithIndex.foldLeft(df) {
  case (accDF, (colName, index)) =>
    // extract element `index` of the array into its own column
    accDF.withColumn(colName, col("DataArray")(index))
}.drop("DataArray")
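
On the question's sample data this yields the requested layout; a quick check (result is the value assigned above):

// expected output: same shape as the target table in the question,
// with columns id, col1, col2, col3
result.show()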

Hope this helps!

koiralo