
I am running a jar on a Spark slave (spark-2.4.6-bin-hadoop2.7) and I am getting this error on submitting the jar:

Exception occurred while create new JwtSource
java.lang.NoClassDefFoundError: Could not initialize class io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder
    at io.spiffe.workloadapi.internal.GrpcManagedChannelFactory.createNativeSocketChannel(GrpcManagedChannelFactory.java:55)
    at io.spiffe.workloadapi.internal.GrpcManagedChannelFactory.newChannel(GrpcManagedChannelFactory.java:41)
    at io.spiffe.workloadapi.DefaultWorkloadApiClient.newClient(DefaultWorkloadApiClient.java:133)
    at io.spiffe.workloadapi.DefaultJwtSource.createClient(DefaultJwtSource.java:221)
    at io.spiffe.workloadapi.DefaultJwtSource.newSource(DefaultJwtSource.java:90)

The SPIFFE dependencies I am using are these:

        <dependency>
            <groupId>io.spiffe</groupId>
            <artifactId>java-spiffe-provider</artifactId>
            <version>0.6.3</version>
        </dependency>
        <dependency>
            <groupId>io.spiffe</groupId>
            <artifactId>java-spiffe-core</artifactId>
            <version>0.6.3</version>
        </dependency>

From the solutions I could find online, it seems the issue is a Guava version conflict: the jar I am deploying is built against Guava 29.0-jre, while the Spark slave picks up guava-14.0.jar from /opt/spark-2.4.6-bin-hadoop2.7/jars.

How do I resolve this dependency conflict?

anand

1 Answer


The general approach is to upgrade the older library, and then update anything that depended on the old version so that it works against the newer one.

In your case, you should probably update Spark, Guava, and possibly a few other dependencies.
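If upgrading Spark is not an option, another common fix (and the one the asker ultimately used, per the comment below) is to shade your own copy of Guava with the maven-shade-plugin, so your jar carries a relocated Guava 29.0-jre that cannot collide with the guava-14.0 jar on Spark's classpath. A minimal sketch for the application's pom.xml; the plugin version and the `myapp.shaded` relocation prefix are illustrative choices, not values from the question:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.4.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <relocations>
                            <!-- Rewrite com.google.common.* references in your
                                 jar (and its bundled Guava classes) to a private
                                 package, so Spark's older Guava is never loaded
                                 by your code. -->
                            <relocation>
                                <pattern>com.google.common</pattern>
                                <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
                            </relocation>
                        </relocations>
                    </configuration>
                </execution>
            </executions>
        </plugin>

After `mvn package`, submit the shaded jar instead of the original one. A related (but more invasive) alternative is setting `spark.driver.userClassPathFirst` / `spark.executor.userClassPathFirst` to `true`, which makes Spark prefer your jar's classes over its own; shading is usually the safer option.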

Edwin Buck
  • yeah the problem was because of version mismatch for guava dependency in class path of spark, resolved it with maven shading – anand Aug 09 '23 at 08:16