
I am new to Scala and Spark.

I have an input DataFrame with one column.

Each element in that column is a list of lists, which I need to convert to a DataFrame and pass to the following function:

def functionName(df: DataFrame): Unit = {
  // code that operates on the per-row DataFrame
}

> inputdf.show()

+------------------------+
|                 col    |
+------------------------+
| [[a, b, c], [d, e, f]] |
| [[g, h, i], [j, k, l]] |
| [[m, n, o], [p, q, r]] |
| [[s, t, u], [v, w, x]] |
+------------------------+
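For reference, this is roughly how such an input could be built (just a sketch to make the question reproducible; my real data comes from earlier processing, and spark here is a SparkSession):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// one column "col", each row holding a list of lists of strings
val inputdf = Seq(
  Tuple1(Seq(Seq("a", "b", "c"), Seq("d", "e", "f"))),
  Tuple1(Seq(Seq("g", "h", "i"), Seq("j", "k", "l"))),
  Tuple1(Seq(Seq("m", "n", "o"), Seq("p", "q", "r"))),
  Tuple1(Seq(Seq("s", "t", "u"), Seq("v", "w", "x")))
).toDF("col")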

To convert each row to a DataFrame, I have tried:

> inputdf.rdd.map(row => functionName(row.toDF()))

> inputdf.rdd.map(row => functionName(sqlContext.createDataFrame(sc.parallelize(row))))

> inputdf.rdd.map(row => functionName(sqlContext.createDataFrame(sc.parallelize(Seq(row)))))

I have tried most of the methods suggested on Stack Overflow, but none of them work. Can someone suggest how to convert each row of inputdf to a DataFrame inside a map function? Thanks in advance.
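For clarity, this is roughly what I would like functionName to receive for the first row above (a sketch, reusing the spark.implicits._ import from the snippet earlier; I am assuming each inner list becomes one row, and the column names c1, c2, c3 are just placeholders):

// expected per-row DataFrame for the first row of inputdf
val expected = Seq(
  ("a", "b", "c"),
  ("d", "e", "f")
).toDF("c1", "c2", "c3")
expected.show()
// +---+---+---+
// | c1| c2| c3|
// +---+---+---+
// |  a|  b|  c|
// |  d|  e|  f|
// +---+---+---+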

  • Possible duplicate of [Convert scala list to DataFrame or DataSet](https://stackoverflow.com/questions/39397652/convert-scala-list-to-dataframe-or-dataset) – Pavel Oct 25 '17 at 11:50
  • Hi @Pavel, I have tried the solution for the other question. But it is not the same question. Thanks for suggestion anyway. – codebasefound Oct 25 '17 at 11:57
