I have an arbitrary-length Array[String] like:
val strs = Array[String]("id","value","group","ts")
How can I convert it to a DataFrame that looks like this:
+----+------+-------+----+
| _0 | _1   | _2    | _3 |
+----+------+-------+----+
| id | value| group | ts |
+----+------+-------+----+
The solutions I tried:
code:
spark.sparkContext.parallelize(List((strs.toList))).toDF().show()
or
spark.sparkContext.parallelize(List(strs)).toDF().show()
result:
+--------------------+
|               value|
+--------------------+
|[id, value, group...|
+--------------------+
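As far as I can tell, this happens because the whole List[String] is encoded as a single array&lt;string&gt; column named value, so all four strings land in one cell instead of four columns. A quick way to confirm this (same spark-shell session as above; the name arrDf is just illustrative):

// the list becomes one array-typed column, not four separate columns
val arrDf = spark.sparkContext.parallelize(List(strs.toList)).toDF()
arrDf.printSchema()   // single column "value" of type array<string>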
code:
spark.sparkContext.parallelize(strs).toDF().show()
result:
+-----+
|value|
+-----+
|   id|
|value|
|group|
|   ts|
+-----+
Not really what I want.
I know a solution like this works when the number of columns is fixed:
val data1 = List(
  (1, "A", "X", 1),
  (2, "B", "X", 2),
  (3, "C", null, 3),
  (3, "D", "C", 3),
  (4, "E", "D", 3)
).toDF("id", "value", "group", "ts").show()
But in my case the Array[String] has arbitrary length, so I cannot write out a fixed-size tuple like that.
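The closest I can think of (only a sketch of one possible approach, assuming the same spark-shell session as above; the column names _0, _1, ... are just chosen to match the layout I want): build the schema dynamically from the array length and wrap the whole array in a single Row, instead of going through toDF on tuples.

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// one StringType field per array element, named _0, _1, _2, ...
val schema = StructType(strs.indices.map(i => StructField(s"_$i", StringType, nullable = true)))

// wrap the whole array in a single Row so each element becomes its own column in one row
val rowRDD = spark.sparkContext.parallelize(Seq(Row.fromSeq(strs.toSeq)))

val df = spark.createDataFrame(rowRDD, schema)
df.show()   // expect one row: id, value, group, ts under columns _0 ... _3

Is there a more idiomatic way to do this, or is building the schema by hand the right approach?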