I'm creating a DataFrame in Spark like this:

    val DF: DataFrame = df.sqlContext.createDataFrame(myTypeRDD, getMyTypeSchema())
MyType is a complex data type. For testing purposes I want to use it as a MyType collection.
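Since MyType and getMyTypeSchema() aren't shown, here is an illustrative stand-in for the shape involved (the names and fields below are made up; the real type has more fields, but the org.joda.time.DateTime field is what matters):

    import org.joda.time.DateTime
    import org.apache.spark.sql.types._

    // Made-up stand-in; the real MyType is more complex
    case class MyType(id: Long, created: DateTime)

    // Made-up schema matching the stand-in above
    def getMyTypeSchema(): StructType = StructType(Seq(
      StructField("id", LongType, nullable = false),
      StructField("created", TimestampType, nullable = false)
    ))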
My attempts:

- Casting on the DataFrame (see the encoder sketch below this list):

      DF.as[MyType]
      DF.map(row => row.asInstanceOf[MyType])

  gave: No Encoder found for org.joda.time.DateTime
- Casting after collecting (see the manual conversion sketch below):

      DF.collect().asInstanceOf[MyType]

  gave: org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema cannot be cast to MyType
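For the first error, my understanding is that Spark has no built-in Encoder for org.joda.time.DateTime. One direction I can think of is registering a Kryo-based encoder and converting each Row explicitly instead of casting (field names here are the made-up ones from the stand-in above, so treat this as a sketch rather than working code for the real type):

    import org.apache.spark.sql.{Dataset, Encoder, Encoders}
    import org.joda.time.DateTime

    // Kryo serializes MyType as opaque binary, sidestepping the missing
    // DateTime encoder, at the cost of losing the column structure
    implicit val myTypeEncoder: Encoder[MyType] = Encoders.kryo[MyType]

    val ds: Dataset[MyType] = DF.map { row =>
      MyType(
        row.getAs[Long]("id"),
        new DateTime(row.getAs[java.sql.Timestamp]("created"))
      )
    }

This feels heavyweight for test code, though.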
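The second error also makes sense in hindsight: collect() returns an Array[Row], and a GenericRowWithSchema can't simply be cast to MyType; each field has to be pulled out by hand. Converting on the driver after collect() needs no Encoder at all (again using the made-up field names):

    import org.joda.time.DateTime

    // No Encoder involved: plain Scala conversion on the driver
    val myTypes: Seq[MyType] = DF.collect().toSeq.map { row =>
      MyType(
        row.getAs[Long]("id"),
        new DateTime(row.getAs[java.sql.Timestamp]("created"))
      )
    }

Writing this out field by field for the real, complex MyType is exactly what I'm hoping to avoid.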
Any suggestions would be appreciated.