
My Spark job fails with the following error and only succeeds after multiple attempts:

Caused by: java.io.InvalidClassException: org.apache.hadoop.hbase.spark.HBaseContext; local class incompatible: stream classdesc serialVersionUID = -5686505108908438419, local class serialVersionUID = -6879194698097628128
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1843)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)

Somebody please help.

Rushi Pradhan
  • read this stackoverflow [answer](https://stackoverflow.com/questions/8335813/java-serialization-java-io-invalidclassexception-local-class-incompatible) – jose praveen Oct 06 '17 at 04:15
  • The other jar we were using, provided by another team, was built against a different version of CDH. After they updated their CDH version and rebuilt the jar, the issue was resolved. – Rushi Pradhan Oct 06 '17 at 13:01
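
The resolution above matches how Java serialization works in general: if a `Serializable` class does not declare a `serialVersionUID`, the JVM computes one from the class structure, so the same class compiled in jars from different CDH/HBase versions can end up with different UIDs, and deserialization fails with exactly this `InvalidClassException`. A minimal sketch of the mechanism (the `Payload` class here is hypothetical, not from the original post) showing how pinning `serialVersionUID` keeps the serialized form stable across rebuilds:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical class for illustration. Declaring serialVersionUID explicitly
// means the stream descriptor stays compatible even if the class is rebuilt
// with a different compiler or dependency version; without it, the JVM
// derives a UID from the class shape, which is what changed between jars here.
class Payload implements Serializable {
    private static final long serialVersionUID = 1L; // pinned, stable across builds
    String value;

    Payload(String value) {
        this.value = value;
    }
}

public class SerialVersionDemo {
    public static void main(String[] args) throws Exception {
        // Serialize an instance to an in-memory buffer.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(new Payload("hello"));
        }

        // Deserialize it back. This only works because the reading side's
        // Payload has a matching serialVersionUID; a mismatch would throw
        // java.io.InvalidClassException: local class incompatible.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(buf.toByteArray()))) {
            Payload p = (Payload) in.readObject();
            System.out.println(p.value);
        }
    }
}
```

Note that `HBaseContext` itself is outside your control, so the practical fix is what the comment describes: ensure both sides (your job and the other team's jar) are built against the same HBase/CDH version so the class descriptors match.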

0 Answers