
I have a custom object that I want to use in Spark:

var myTuple: Seq[(String, MyCustomObject)] = Seq()


import java.io.{IOException, ObjectInputStream, ObjectOutputStream}

class MyCustomObject(var x: String) extends Serializable {

  override def toString = s"$x"

  @throws[IOException]
  private def writeObject(out: ObjectOutputStream): Unit = {
    out.writeObject(x)
  }

  @throws[IOException]
  @throws[ClassNotFoundException]
  private def readObject(in: ObjectInputStream): Unit = {
    // readObject() takes no arguments; it returns the next object from the stream
    x = in.readObject().asInstanceOf[String]
  }
}

Above, I defined a custom class that extends Serializable, but I still get the following error:

java.lang.UnsupportedOperationException: No Encoder found for MyCustomObject

Can someone point me in the right direction?
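
For context, here is roughly how the error is triggered (a minimal sketch; the SparkSession setup and the sample value are just for illustration):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val data: Seq[(String, MyCustomObject)] = Seq(("a", new MyCustomObject("a")))

// Compiles (the tuple is a Product), but encoder derivation fails at runtime:
val ds = data.toDS() // java.lang.UnsupportedOperationException: No Encoder found for MyCustomObject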

Edit:

I've looked at the duplicate, but I still have trouble getting the following to work without an error:

var myTuple: Seq[Seq[(String, MyCustomObject)]] = Seq()

I've basically added the following implicits:

import scala.reflect.ClassTag
import org.apache.spark.sql.{Encoder, Encoders}

implicit def single[A](implicit c: ClassTag[A]): Encoder[A] = Encoders.kryo[A](c)

implicit def tuple2[A1, A2](
  implicit e1: Encoder[A1],
           e2: Encoder[A2]
): Encoder[(A1,A2)] = Encoders.tuple[A1,A2](e1, e2)
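
For what it's worth, here is how I expect those implicits to combine: the flat tuple case type-checks when I wire the encoders explicitly, and the nested Seq is where I still see the error (a sketch, assuming spark is the SparkSession from the snippet above):

// Explicit wiring resolves the flat tuple case:
val flatEncoder: Encoder[(String, MyCustomObject)] =
  tuple2(Encoders.STRING, single[MyCustomObject])

val flat = spark.createDataset(Seq(("a", new MyCustomObject("a"))))(flatEncoder)

// The nested case needs an Encoder[Seq[(String, MyCustomObject)]]:
var myTuple: Seq[Seq[(String, MyCustomObject)]] = Seq()
val nested = spark.createDataset(myTuple) // this is the line that still errors for me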
