This is on Scala 2.12.10 and Spark 2.4.8. I am trying to define a trait with a method that converts an array of some case class into a DataFrame. The type parameter is meant to be a schema case class that extends `QuestionSchema`, hence the bound `T <: schemas.QuestionSchema`. I import the Spark implicits so that I can call `toDF` after converting the array to a `Seq`, but it does not seem to work. Can anyone see what is wrong here, or suggest another way to do this?
import org.apache.spark.sql.{DataFrame, SparkSession}

trait DataStore {
  var data1 = Array.empty[Data1Type]
  var data2 = Array.empty[Data2Type]

  def convertToDf[T <: schemas.QuestionSchema](res: Array[T])(implicit spark: SparkSession): DataFrame = {
    import spark.implicits._
    res.toSeq.toDF() // error: value toDF is not a member of Seq[T]
  }
}
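My understanding is that `toDF` needs an implicit `Encoder[T]` in scope, and `spark.implicits._` can only derive encoders for concrete `Product` types, not for an abstract `T`. I also tried pushing that requirement to the call site with a context bound, which I believe is what `toDF` ultimately needs, though I'm not sure it's the idiomatic approach:

```scala
import org.apache.spark.sql.{DataFrame, Encoder, SparkSession}

trait DataStore {
  var data1 = Array.empty[Data1Type]
  var data2 = Array.empty[Data2Type]

  // Demand an Encoder[T] from the caller via a context bound, since
  // spark.implicits._ cannot derive one for an abstract type parameter
  def convertToDf[T <: schemas.QuestionSchema : Encoder](res: Array[T])(implicit spark: SparkSession): DataFrame = {
    import spark.implicits._
    res.toSeq.toDF() // compiles once an Encoder[T] is in scope
  }
}
```

At a call site where the element type is a concrete case class (say `data1` of type `Array[Data1Type]`), the encoder should then be derived automatically by `spark.implicits._`, e.g. `convertToDf(data1)`. Is this the right way to do it, or is there a cleaner alternative?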