
I was able to create an executable using launch4j and it works fine on my machine. When I send it to someone to run on their Windows machine, they get the following error:

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: spark/TemplateEngine
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Unknown Source)
        at java.lang.Class.privateGetMethodRecursive(Unknown Source)
        at java.lang.Class.getMethod0(Unknown Source)
        at java.lang.Class.getMethod(Unknown Source)
        at sun.launcher.LauncherHelper.validateMainClass(Unknown Source)
        at sun.launcher.LauncherHelper.checkAndLoadMain(Unknown Source)
Caused by: java.lang.ClassNotFoundException: spark.TemplateEngine
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        ... 7 more

Any thoughts?

Zohaib Zaidi

3 Answers


This exception may also occur if a Maven dependency's <scope> is not the default compile scope and the dependency jar is not available when the program runs. For instance, if a dependency is declared with <scope>provided</scope> in your pom.xml, the build assumes the JRE/runtime environment will supply that jar. The sources still compile without errors, but if the jar is missing from the classpath at runtime, this exception is thrown.

For example, the declaration below can lead to this exception: the spark-mllib_2.11 jar is on the compile classpath, so there is no error while editing or building, but it is not packaged with the application and cannot be found at runtime:

<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-mllib_2.11</artifactId>
   <version>2.2.0</version>
   <scope>provided</scope>
</dependency>
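
If the Spark classes actually need to ship with your application (rather than being supplied by the environment), one possible fix is to drop the provided scope so the dependency falls back to the default compile scope and gets bundled by whatever packaging you use (for example a fat-jar plugin such as maven-shade-plugin). A minimal sketch, assuming you package the dependencies yourself:

<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-mllib_2.11</artifactId>
   <version>2.2.0</version>
   <!-- no <scope> element: defaults to compile, so the jar can be bundled with the application -->
</dependency>
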
khawarizmi

I had this problem because my default Java version was 9, and Spark did not recognize it. I switched to version 8 and it worked. To change the default version on Linux:

sudo update-java-alternatives -s java-1.8.0-openjdk-amd64

In your case you may want a different version, so choose yours (to list the versions installed on your machine, use the -l option).
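
A rough sequence, assuming a Debian/Ubuntu-style system where update-java-alternatives is available (the exact alternative name may differ on your machine):

# list the Java alternatives installed on this machine
sudo update-java-alternatives -l

# switch the default to the Java 8 alternative named in that list
sudo update-java-alternatives -s java-1.8.0-openjdk-amd64

# confirm which version is now the default
java -version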

Andrelbol

I had to set relative paths in the classpath so the executable could locate the jar files (see the sketch below).
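
For anyone hitting the same issue: launch4j lets you define a custom classpath in its configuration, and the entries are resolved relative to the generated .exe. A minimal sketch, where the main class, jar name, and lib folder are placeholders for illustration, not the original poster's actual values:

<classPath>
   <mainClass>com.example.Main</mainClass>
   <!-- paths are resolved relative to the executable's location -->
   <cp>myapp.jar</cp>
   <cp>lib/*.jar</cp>
</classPath>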

Zohaib Zaidi