
I have an application in Java that uses Spark and HBase. We need to hit a URL deployed in Tomcat (Jersey), so we have used the RESTEasy client to do that.

When I execute standalone Java code to hit the URL using the RESTEasy client, it works fine.
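A minimal sketch of the kind of client call involved (the URL and response type here are placeholders, not the actual endpoint):

    import javax.ws.rs.core.Response;
    import org.jboss.resteasy.client.jaxrs.ResteasyClient;
    import org.jboss.resteasy.client.jaxrs.ResteasyClientBuilder;

    public class RestCall {
        public static void main(String[] args) {
            ResteasyClient client = new ResteasyClientBuilder().build();
            // placeholder URL; the real endpoint is the one deployed in Tomcat (Jersey)
            Response response = client.target("http://localhost:8080/myapp/resource")
                                      .request()
                                      .get();
            // readEntity(...) exists only in the JAX-RS 2.0 Response API;
            // the JAX-RS 1.x Response bundled with older Jersey versions does not have it
            String body = response.readEntity(String.class);
            System.out.println(body);
            response.close();
            client.close();
        }
    }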

However, when I use the same code in my other application, which uses Spark for some processing, it throws the error shown in the title. I am using Maven as the build tool in Eclipse. After building, I create a runnable jar and select the option "extract required libraries into generated jar". To execute the application I use the command:

nohup spark-submit --master yarn-client myWork.jar myProperties 0 &

The dependency for the RESTEasy client code:

<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.jboss.resteasy</groupId>
    <artifactId>resteasy-client</artifactId>
    <version>3.0.11.Final</version>
  </dependency>
</dependencies>

I am unable to figure out why it throws no error at compile time, yet at runtime, although the jar has every library packed in (including those of Spark and HBase), it throws an error saying no such method. Please help.

edge
  • Which version of spark are you using? – VladoDemcak Jan 07 '17 at 12:54
  • This is clearly a version mismatch error. This answer may help: http://stackoverflow.com/questions/24139097/resteasy-client-nosuchmethoderror – Chobeat Jan 07 '17 at 17:57
  • spark version = 1.4.1 – edge Jan 09 '17 at 04:44
  • I have tried changing the version of resteasy-client but it didn't help. At compile time I can see the class; how come at runtime it is missing, even when I have packed all the libraries into the jar? – edge Jan 09 '17 at 05:41

1 Answer


I have tried changing the version of resteasy-client but it didn't help. At compile time I can see the class; how come at runtime it is missing?

Possible reasons could be:

1) If you are using Maven, the dependency scope might be provided, so the jar won't be copied into your distribution.

This is ruled out by the configuration you have mentioned above.

2) You are not pointing to the correct location from your execution script (maybe a shell script).

3) You are not passing this jar with the --jars option or via the driver/executor classpath (--driver-class-path, spark.executor.extraClassPath, etc.).

I suspect the issue is due to the second or third reason. An example submit command with the jar passed explicitly is shown below.
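If the RESTEasy classes were not actually inside the application jar, they (and their transitive jars) could be passed explicitly; the path and jar name here are illustrative:

    nohup spark-submit --master yarn-client \
      --jars /path/to/resteasy-client-3.0.11.Final.jar \
      myWork.jar myProperties 0 &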

Also have a look at https://spark.apache.org/docs/1.4.1/submitting-applications.html

EDIT :

Question : spark-submit --conf spark.driver.extraClassPath=surfer/javax.ws.rs-api-2.0.1.jar:surfer/jersey-client-2.25.jar:surfer/jersey-common-2.25.jar:surfer/hk2-api-2.5.0-b30.jar:surfer/jersey-guava-2.25.jar:surfer/hk2-utils-2.5.0-b30.jar:surfer/hk2-locator-2.5.0-b30.jar:surfer/javax.annotation-api-1.2.jar artifact.jar againHere.csv

Now it throws a different exception: Exception in thread "main" java.lang.AbstractMethodError: javax.ws.rs.core.UriBuilder.uri(Ljava/lang/String;)Ljavax/ws/rs/core/UriBuilder;. I have also tried searching for the class Response$Status$Family somewhere in the classpath other than what I am supplying. I used the command grep Response$Status$Family.class /opt/mapr/spark/spark-1.4.1/lib/*.jar and found that Spark also has this class. Maybe this is the issue, but how do I forcefully tell the JVM to use the class supplied by me at runtime and not Spark's? Can you help?
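One quick way to confirm which jar the conflicting class is actually loaded from at runtime is to print its code source from inside the driver; a small diagnostic sketch:

    // Prints the jar that javax.ws.rs.core.Response was loaded from at runtime
    // (getCodeSource() can be null if the class comes from the bootstrap class loader)
    System.out.println(javax.ws.rs.core.Response.class
            .getProtectionDomain()
            .getCodeSource()
            .getLocation());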

Since you provided the external jars on the classpath:

You can use the options below to tell the framework that it has to use the external jars provided by you. This can be done in two ways (both are sketched below):

  1. through spark submit
  2. conf.set...

Since you are using 1.4.1, see these configuration options:

spark.driver.userClassPathFirst (default: false, experimental): Whether to give user-added jars precedence over Spark's own jars when loading classes in the driver. This feature can be used to mitigate conflicts between Spark's dependencies and user dependencies. It is used in cluster mode only.

spark.executor.userClassPathFirst (default: false, experimental): Same functionality as spark.driver.userClassPathFirst, but applied to executor instances.
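A minimal sketch of both ways, assuming the driver code is in Java (the class and app name are placeholders). As the documentation quoted above notes, the driver property applies in cluster mode only, so in yarn-client mode it is the executor setting that this can change:

    spark-submit --master yarn-client \
      --conf spark.driver.userClassPathFirst=true \
      --conf spark.executor.userClassPathFirst=true \
      myWork.jar myProperties 0

or, programmatically:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class MyWork {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                .setAppName("myWork")
                // give user-added jars precedence over Spark's own copies (experimental in 1.4.1)
                .set("spark.executor.userClassPathFirst", "true")
                .set("spark.driver.userClassPathFirst", "true"); // effective in cluster mode only
            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... existing Spark/HBase processing and the REST call go here ...
            sc.stop();
        }
    }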

Ram Ghadiyaram
  • I am using Maven as the build tool in Eclipse. After building it, I am creating a runnable jar and selecting the option "extract required libraries into generated jar". Inside the application jar, I can see the class 'Response' and the readEntity method as well. – edge Jan 09 '17 at 09:08
  • OK, how are you submitting the job then? Can you print the submit command? I can't see a --jars option here; I guess it's taking an old jar file in which that method is not present. – Ram Ghadiyaram Jan 09 '17 at 09:21
  • Your `myWork.jar` is an uber jar? I can't see a --jars option in the above question. – Ram Ghadiyaram Jan 09 '17 at 09:24
  • Yes, I have not supplied the --jars option because I have extracted the RESTEasy library into the jar itself. – edge Jan 09 '17 at 09:32
  • I have also tried using the --jars option. It works for some other libraries like poi.jar which I supply externally, but it does not work for javax.ws.rs.core. – edge Jan 09 '17 at 09:34
  • Which version of Java are you using? I guess this class is bundled with the Java src.zip itself, so it is taking an old version of the jar, isn't it? – Ram Ghadiyaram Jan 09 '17 at 10:01
  • Java version "1.7.0_80". – edge Jan 09 '17 at 10:27
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/132785/discussion-between-edge-and-ram-ghadiyaram). – edge Jan 10 '17 at 11:33
  • Since it is experimental, I can't use it; I will find some other workaround. I am taking the REST part out of this Spark application, and as it works in a separate Java program, I will invoke it as a separate call to that program. – edge Jan 11 '17 at 09:09