
How do you typically fix the "java.io.Serializable" error below?

I am guessing the data types returned in my functions caused it. How do you avoid this, or change the result back to the right type?

def allKeys(sampledf: DataFrame): DataFrame = {......}

val afd12 = afd.schema.fieldNames.contains("ID") && afd.schema.fieldNames.contains("CONNECTIDS") match {
   case true => allKeys(afd) 
   case false => "no"
}

afd12.printSchema()

This is the error I get:

afd: java.io.Serializable = [ID: string, ADDITIONALINFO: string ... 87 more fields]
<console>:95: error: value printSchema is not a member of java.io.Serializable
   afd12.printSchema()
         ^

1 Answer


You have to make sure that the pattern match block

 match {
   case true => allKeys(afd) 
   case false => "no"
}

returns a consistent type from every branch. Right now one branch returns Dataset[Row] and the other a String, so the closest common type the compiler can infer is java.io.Serializable. The simplest fix is to return an empty DataFrame with a schema of your choice instead of "no".

match {
  case true => allKeys(afd) 
  case _ => spark.emptyDataFrame
}