
For a project requirement I was trying to build the FlumeUtils example that ships with the Spark examples. I was able to create the jar file, but when I try to execute it I get the following error. Can anybody help me resolve this?

Error: application failed with exception
java.lang.NoClassDefFoundError: org/apache/spark/streaming/flume/FlumeUtils
        at SimpleApp.main(SimpleApp.java:61)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:367)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:77)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.flume.FlumeUtils
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)




I have included the following dependencies in my pom file:
<dependency> <!-- Spark Flume dependency -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-flume_2.10</artifactId>
  <version>1.2.1</version>
</dependency>
<dependency> <!-- Spark Core dependency -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.1</version>
</dependency>
<dependency> <!-- Spark Streaming dependency -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.2.1</version>
</dependency>
And I am running it with: spark-submit --class SimpleApp target/simple-project-1.0.jar

Can anyone help me with this?

Prashant Agrawal
  • Does your simple-project-1.0.jar contain the Flume API inside it (i.e., is your jar packaged as an uber/fat jar)? – vijay kumar Jul 31 '15 at 13:17
  • How can I check whether simple-project-1.0.jar contains the Flume API? I have imported it in our code and included the dependencies mentioned in the post (spark-streaming-flume_2.10, spark-core_2.10, spark-streaming_2.10). – Prashant Agrawal Aug 03 '15 at 05:43
  • You can check by executing 'jar -tvf jar-name.jar'. Note that adding a dependency doesn't mean it will end up in simple-proj-1.0.jar. For more details refer to http://stackoverflow.com/questions/1729054/including-dependencies-in-a-jar-with-maven – vijay kumar Aug 03 '15 at 06:35
  • While running the above command I get output such as: META-INF/, META-INF/MANIFEST.MF, SimpleApp.class, SimpleApp$1.class, META-INF/maven/org.apache.spark/simple-project/pom.xml, META-INF/maven/org.apache.spark/simple-project/pom.properties. It seems the Flume dependencies are not there, so how can I include those dependencies when building? – Prashant Agrawal Aug 04 '15 at 07:18
  • Refer to this example: http://www.mkyong.com/maven/create-a-fat-jar-file-maven-shade-plugin/ – vijay kumar Aug 04 '15 at 12:41
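Following the maven-shade-plugin suggestion in the comments above, a minimal sketch of a pom.xml build section that bundles compile-scope dependencies (including the Flume connector) into the application jar. The plugin version and the signature-file filter are assumptions, not taken from the original thread:

```xml
<build>
  <plugins>
    <!-- maven-shade-plugin repackages the app plus its compile-scope
         dependencies into a single "fat" jar during the package phase -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <!-- strip signature files from dependencies; leftover
                 signatures would make the merged jar fail verification -->
            <filters>
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
            </filters>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

With this in place, you would typically also mark spark-core_2.10 and spark-streaming_2.10 as `<scope>provided</scope>` so that Spark itself is not bundled into the fat jar; only the Flume connector (which the cluster does not provide) needs to be included.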

2 Answers


If you are running the job using spark-submit, make sure you are providing the spark-streaming_2.10 jar via the --jars option.

  • I am getting the same error when I run it with the --jars option as mentioned: spark-submit --jars ../external/spark-native-yarn/lib/spark-streaming_2.10-1.2.1.2.2.6.0-2800.jar --class "SimpleApp" target/simple-project-1.0.jar – Prashant Agrawal Aug 03 '15 at 05:43
  • Why do you expect to find FlumeUtils in the spark-streaming jar? Add this jar to your classpath: spark-streaming-flume_2.10-1.4.1.jar – Praneeth Reddy G Aug 03 '15 at 07:33
  • I gave --jars as spark-streaming-flume_2.10-1.2.1.jar; it then reported a missing Avro source dependency, which I also added, and it worked. Meanwhile, how can I check which dependencies the generated application jar (simple-project-1.0.jar) contains? Also, can we provide these two jars when building the application itself, so they are included in simple-project-1.0.jar? – Prashant Agrawal Aug 04 '15 at 07:22
  • You could build an uber jar, but prefer jar management... you may not want to ignore what your app depends on. – Praneeth Reddy G Aug 05 '15 at 19:25
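The fix that worked in the comments above can be sketched as a single submit command. The jar path is illustrative (an assumption); point --jars at wherever the Flume connector jar actually lives:

```shell
# Pass the Flume connector jar to spark-submit so it is on the
# driver and executor classpaths at runtime (path is illustrative)
spark-submit \
  --jars spark-streaming-flume_2.10-1.2.1.jar \
  --class SimpleApp \
  target/simple-project-1.0.jar
```

Note that --jars accepts a comma-separated list, so the Avro jars mentioned in the comments can be appended to the same option.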

Downloading spark-streaming-flume-assembly_2.10.jar from https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-flume-assembly and running spark-submit with the option --jars spark-streaming-flume-assembly_2.10.jar should solve the NoClassDefFoundError, which arises from a runtime dependency-resolution problem: the assembly jar already contains FlumeUtils and its transitive dependencies.
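For completeness, the assembly-jar route might look like the following; the version number and local path are illustrative assumptions, so match them to your Spark version:

```shell
# Submit with the self-contained assembly jar, which bundles
# FlumeUtils together with its Avro/Flume dependencies
# (download it first from the mvnrepository page linked above)
spark-submit \
  --jars spark-streaming-flume-assembly_2.10-1.2.1.jar \
  --class SimpleApp \
  target/simple-project-1.0.jar
```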