
I am trying to run a Spark job but am getting the error below.

21/12/24 15:40:43 ERROR SparkContext: Error initializing SparkContext.
java.lang.RuntimeException: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
    at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:131)
    at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:118)
    at org.apache.spark.network.server.TransportServer.init(TransportServer.java:95)

Here are the Netty dependencies being used:

netty-3.7.0.Final.jar netty-all-4.0.43.Final.jar
netty-buffer-4.1.69.Final.jar netty-codec-4.1.69.Final.jar
netty-codec-http-4.1.69.Final.jar netty-codec-socks-4.1.60.Final.jar
netty-common-4.1.69.Final.jar netty-handler-4.1.69.Final.jar
netty-handler-proxy-4.1.60.Final.jar netty-resolver-4.1.69.Final.jar
netty-transport-4.1.69.Final.jar
netty-transport-native-epoll-4.1.69.Final.jar
netty-transport-native-epoll-4.1.60.Final-linux-x86_64.jar
netty-transport-native-kqueue-4.1.69.Final.jar
netty-transport-native-kqueue-4.1.60.Final-osx-x86_64.jar
netty-transport-native-unix-common-4.1.69.Final.jar

I have also tried with netty-all version 4.0.43, but I still get the same error. Spark version used: 2.2.3. Can anyone please help me understand why this issue occurs?


1 Answer


Ensure you force the same Netty version for everything. You have multiple versions on the classpath; just use 4.1.72.Final for all Netty artifacts.
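As a minimal sketch, assuming an sbt build (the question does not show which build tool is used), you could pin every io.netty artifact to one version with dependencyOverrides so that mixed 4.0.x / 4.1.x jars no longer end up on the classpath together:

// build.sbt — hypothetical sketch: force a single Netty version for all artifacts
val nettyVersion = "4.1.72.Final"

dependencyOverrides ++= Seq(
  "io.netty" % "netty-all"       % nettyVersion,
  "io.netty" % "netty-buffer"    % nettyVersion,
  "io.netty" % "netty-codec"     % nettyVersion,
  "io.netty" % "netty-common"    % nettyVersion,
  "io.netty" % "netty-handler"   % nettyVersion,
  "io.netty" % "netty-resolver"  % nettyVersion,
  "io.netty" % "netty-transport" % nettyVersion
)

After rebuilding, you can inspect the resolved dependency tree (for example with sbt's dependencyTree task if the dependency-tree plugin is enabled, or mvn dependency:tree for a Maven build) to confirm that only one Netty version remains on the classpath.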
