I am trying to run a job via spark-submit.
The error that results from this job is:
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2625)
at java.lang.Class.getMethod0(Class.java:2866)
at java.lang.Class.getMethod(Class.java:1676)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 6 more
Not sure if it matters, but I am trying to run this job within a Docker container on Mesos. Spark is 1.6.1, Mesos is 0.27.1, Python is 3.5, and Docker is 1.11.2. I am running in client mode.
Here is the gist of my spark-submit statement:
export SPARK_PRINT_LAUNCH_COMMAND=true
./spark-submit \
--master mesos://mesos-blahblahblah:port \
--conf spark.mesos.executor.docker.image=docker-registry:spark-docker-image \
--conf spark.mesos.executor.home=/usr/local/spark \
--conf spark.executorEnv.MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.dylib \
--conf spark.shuffle.service.enabled=true \
--jars ~/spark/lib/slf4j-simple-1.7.21.jar \
test.py
The gist of test.py is that it loads data from parquet, sorts it by a particular column, and then writes it back to parquet.
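Roughly, it looks like this (a minimal sketch; the parquet paths and the sort column are placeholders, not the actual names in my job):

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="parquet-sort")
sqlContext = SQLContext(sc)

# read from parquet, sort by one column, write the result back to parquet
df = sqlContext.read.parquet("/path/to/input.parquet")
df.sort("some_column").write.parquet("/path/to/output.parquet")

sc.stop()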
I added the --jars line when I kept getting that error. (The error does not appear in my driver; I have to navigate through the Mesos framework to the stderr of each Mesos task to find it.)
I also tried adding --conf spark.executor.extraClassPath=http:some.ip:port/jars/slf4j-simple-1.7.21.jar, because I noticed that when I ran the spark-submit from above it would output
INFO SparkContext: Added JAR file:~/spark/lib/slf4j-simple-1.7.21.jar at http://some.ip:port/jars/slf4j-simple-1.7.21.jar with timestamp 1472138630497
But the error is unchanged. Any ideas?
I found this link, which makes me think it is a bug, but the person hasn't posted any solution.