I tried to follow this question to convert an RDD to a DataFrame in Spark. The case class in my use case has more than 100 fields (columns):
case class MyClass(val1: String, ..., val104: String)

val df = rdd.map {
  case Row(val1: String, ..., val104: String) => MyClass(val1, ..., val104)
}.toDF("col1_name", ..., "col104_name")
This fails to compile with the error: too many arguments for unapply pattern, maximum = 22
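I have seen hints that building the DataFrame from an explicit schema avoids the Row pattern match (and its 22-argument limit) entirely, but I am not sure this is the right approach. A rough sketch of what I have in mind, assuming rdd is already an RDD[Row] whose 104 fields are all strings and sqlContext is the usual SQLContext:

import org.apache.spark.sql.types.{StructField, StructType, StringType}

// Build the schema programmatically instead of pattern matching on Row,
// so the 22-argument unapply limit never applies.
val columnNames = (1 to 104).map(i => s"col${i}_name")
val schema = StructType(columnNames.map(name => StructField(name, StringType, nullable = true)))

// rdd is assumed to be an RDD[Row] with 104 string fields in the same order as the schema.
val df = sqlContext.createDataFrame(rdd, schema)

Is this the recommended way, or is there a way to keep the case class?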
Could someone help me with a concrete example? I'm using Spark 1.6 with Scala. Thank you.