
I have a Spark data frame of the format org.apache.spark.sql.DataFrame = [user_key: string, field1: string]. When I use saveAsTextFile to save the file in HDFS, the results look like [12345,xxxxx]. I don't want the opening and closing brackets written to the output file. If I use .rdd to convert it into an RDD, the brackets are still present in the RDD.
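A minimal sketch of the setup, with made-up data and a placeholder path (Spark 1.x API assumed):

```scala
import org.apache.spark.sql.SQLContext

// Hypothetical data reproducing the issue
val sqlContext = new SQLContext(sc)
val df = sqlContext.createDataFrame(Seq(("12345", "xxxxx")))
  .toDF("user_key", "field1")

// Each Row's toString wraps its fields in brackets,
// so the output file contains lines like "[12345,xxxxx]"
df.rdd.saveAsTextFile("some_path")
```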

Thanks

Naveenan

1 Answer


Just concatenate the values and store strings:

import org.apache.spark.sql.functions.{concat_ws, col}
import org.apache.spark.sql.Row

val expr = concat_ws(",", df.columns.map(col): _*)
df.select(expr).map(_.getString(0)).saveAsTextFile("some_path")

Or even better use spark-csv:

df.write
  .format("com.databricks.spark.csv")
  .option("header", "false")
  .save("some_path")
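(If you are on Spark 2.x or later, the CSV writer is built in and the spark-csv package isn't needed; an equivalent sketch:

```scala
df.write
  .option("header", "false")
  .csv("some_path")
```

)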

Another approach is to simply map:

df.rdd.map(_.toSeq.map(_.toString).mkString(","))

and save afterwards.
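Putting that last approach together with the save (the output path is a placeholder):

```scala
// Convert each Row to a plain comma-joined string, then write as text
df.rdd
  .map(_.toSeq.map(_.toString).mkString(","))
  .saveAsTextFile("some_path")
```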

zero323
  • Thanks for the response. In production we have version 1.4.2, and concat_ws is not part of this version. But I was able to select the two columns, and only one column is accessible. When I use map(p=>p(0),p(1)) I get an error. Thanks – Naveenan Dec 18 '15 at 19:49
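  • For Spark versions before 1.5 (where concat_ws was added), the map-based approach above should work; the error with map(p=>p(0),p(1)) likely comes from trying to return two values instead of building a single string. A sketch, assuming the two-column schema from the question:

    ```scala
    // Access Row fields by position and join them into one output line
    df.rdd
      .map(row => s"${row(0)},${row(1)}")
      .saveAsTextFile("some_path")
    ```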