
This is a follow-up to my earlier Maven-Spark question. I can now compile all classes and execute the non-Spark ones, but when I run a class that uses Spark I get the following error message:

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/mllib/linalg/DenseMatrix
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.mllib.linalg.DenseMatrix
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more

Obviously Java cannot find the Spark packages. I suspect I am executing my jar from a location where my Spark installation is not visible. I have also found similar topics on SE, and from those I understood that I need to specify the CLASSPATH for my .jar file. I usually build my .jar with the command:

mvn package

My question is: how can I specify the CLASSPATH for my .jar explicitly (so that the Spark classes are found)? Thank you for your time.
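
For illustration, one direct way to hand Spark's jars to the JVM is an explicit -cp argument; this is only a sketch, and both the Spark lib path and the main class name below are placeholders:

# Sketch only: /path/to/spark/lib and my.package.Main are placeholders.
# The wildcard puts every jar in Spark's lib directory on the classpath.
java -cp "target/TestMaven-0.0.1-SNAPSHOT.jar:/path/to/spark/lib/*" my.package.Main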

Additionally, here is my pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>Prototype</groupId>
  <artifactId>TestMaven</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>Test</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>


  <dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-csv_2.10</artifactId>
        <version>1.0.3</version>
    </dependency>
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-math3</artifactId>
        <version>3.5</version>
    </dependency>
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-template-freemarker</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.3.1</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>

Update: I have found a solution and can now execute my Spark program as described in the Spark documentation.
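
For reference, the submission command from the Spark documentation looks roughly like this; a minimal sketch, where the main class name and master URL are placeholders and the jar is the dependency-bundled one produced by the answers below:

# Sketch: my.package.Main and local[2] are assumptions.
./bin/spark-submit \
  --class my.package.Main \
  --master local[2] \
  target/TestMaven-0.0.1-SNAPSHOT-jar-with-dependencies.jar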

Guforu
  • You don't; Maven must manage that for you. It probably is managing it for you and something else is wrong, for example in the way you run the application. It is also a rule of thumb to make questions stand on their own, so I would replicate the pom here rather than assume people will see it in the question you linked to. – Gimby Oct 27 '15 at 08:32
  • The title of a question should always explain what the details are about. In a community, we encourage questions and answers for two reasons: 1. You will get some help. 2. Someone else with a similar issue can find your question. With a title that just says Maven with Spark, nobody will find it for reason 2. – NewUser Oct 27 '15 at 08:33
  • Can you please update the title for the other question as well? – NewUser Oct 27 '15 at 08:36
  • I have updated both titles – Guforu Oct 27 '15 at 08:38

2 Answers


Your dependencies are not included in your jar file. You can use the Maven Assembly Plugin to bundle all of your dependencies into it.

Put this in your pom.xml:

<build>
    <sourceDirectory>src/main/java</sourceDirectory>
    <testSourceDirectory>src/test/java</testSourceDirectory>
    <plugins>
        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <descriptor>assembly.xml</descriptor>
            </configuration>
        </plugin>
    </plugins>
    <resources>
        <resource>
            <directory>resources</directory>
        </resource>
    </resources>
</build>

And put assembly.xml next to your pom.xml (the <descriptor> path above is resolved relative to the project root):

<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3 http://maven.apache.org/xsd/assembly-1.1.3.xsd">
    <!-- TODO: a jarjar format would be better -->
    <id>jar-with-dependencies</id>
    <formats>
        <format>jar</format>
    </formats>
    <includeBaseDirectory>false</includeBaseDirectory>
    <dependencySets>
        <dependencySet>
            <outputDirectory>/</outputDirectory>
            <useProjectArtifact>true</useProjectArtifact>
            <unpack>true</unpack>
            <scope>runtime</scope>
        </dependencySet>
    </dependencySets>
</assembly>
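
With this configuration, mvn package should additionally produce a *-jar-with-dependencies.jar under target/, which you can run directly. A usage sketch based on this project's coordinates, where the main class name is a placeholder:

# Build, then run the self-contained jar (my.package.Main is a placeholder).
mvn package
java -cp target/TestMaven-0.0.1-SNAPSHOT-jar-with-dependencies.jar my.package.Main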
Avihoo Mamka

To run Spark applications you need to create an uber jar that contains your application classes together with all of their dependencies. You can use the Maven Shade Plugin to do that.

Include this plugin in your pom file and repackage your application. Make sure you remove the assembly plugin if you are using it:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <configuration>
        <transformers>
            <transformer
                implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>[YOUR MAIN CLASS]</mainClass>
            </transformer>
        </transformers>
        <shadedArtifactAttached>true</shadedArtifactAttached>
        <shadedClassifierName>launcher</shadedClassifierName>
        <keepDependenciesWithProvidedScope>false</keepDependenciesWithProvidedScope>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>
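
Because shadedArtifactAttached is true and the classifier is launcher, mvn package attaches the uber jar with a -launcher suffix. A usage sketch based on this project's coordinates (the main class and master URL are placeholders):

# Build, then submit the shaded jar produced under target/.
mvn package
spark-submit --class [YOUR MAIN CLASS] --master local[2] target/TestMaven-0.0.1-SNAPSHOT-launcher.jar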
Akrem