I'm getting the exception below while using Spark 2.0 with JSch for file transfer. The Spark lib ships JSch 0.1.42, and I also came across the option to override the built-in dependency. It works fine in client mode, but when submitting to a YARN cluster it fails with the exception below. Any suggestions, please?
Connecting to xxxxxxxxxxxx.com port 22
Connection established
Remote version string: SSH-2.0-OpenSSH_7.4
Local version string: SSH-2.0-JSCH-0.1.42
CheckCiphers: aes256-ctr,aes192-ctr,aes128-ctr,aes256-cbc,aes192-cbc,aes128-cbc,3des-ctr,arcfour,arcfour128,arcfour256
SSH_MSG_KEXINIT sent
SSH_MSG_KEXINIT received
Disconnecting from xxxxxxxxxxxxxxx.com port 22
18/09/26 10:56:50 ERROR Myclass: Error occurred in processing:
com.jcraft.jsch.JSchException: Algorithm negotiation fail
com.jcraft.jsch.JSchException: Algorithm negotiation fail
at com.jcraft.jsch.Session.receive_kexinit(Session.java:520)
at com.jcraft.jsch.Session.connect(Session.java:286)
at com.jcraft.jsch.Session.connect(Session.java:150)
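The trace shows the connection dropping right after the SSH_MSG_KEXINIT exchange, i.e. during algorithm negotiation. For context, the algorithm lists on each side can be inspected with a stock OpenSSH client; a rough check looks like this (the host name is a placeholder, and `-Q` needs OpenSSH 6.3 or later):

```shell
# List the key-exchange and cipher algorithms the local OpenSSH build
# supports (run the same on the server to see its side).
ssh -Q kex
ssh -Q cipher

# Or watch a live negotiation from a client: with -vv the debug output
# prints the server's proposed kex/cipher lists right after KEXINIT.
ssh -vv user@xxxxxxxxxxxx.com
```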
If I try the override option, loading the jars via the spark-submit command, I get:
java.lang.LinkageError: loader constraint violation: when resolving method "org.slf4j.impl.StaticLoggerBinder.getLoggerFactory()Lorg/slf4j/ILoggerFactory;" the class loader (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current class, org/slf4j/LoggerFactory, and the class loader (instance of sun/misc/Launcher$AppClassLoader) for the method's defining class, org/slf4j/impl/StaticLoggerBinder, have different Class objects for the type org/slf4j/ILoggerFactory used in the signature
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:418)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
at org.apache.commons.logging.LogFactory$Slf4jDelegate.createLocationAwareLog(LogFactory.java:174)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:111)
There is already a question here, JSch Algorithm negotiation fail, but that approaches it from a plain Java point of view, whereas I'm looking at it from the YARN and Spark side.
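For reference, the override attempt that triggers the LinkageError is along these lines; this is only a sketch assuming the standard `userClassPathFirst` approach (which is what puts `ChildFirstURLClassLoader` in the trace), and the jar path, JSch version, and class/jar names are placeholders:

```shell
# Sketch: force a user-supplied JSch jar ahead of the one bundled with
# Spark by enabling child-first class loading (placeholder paths/names).
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --jars /path/to/jsch-0.1.54.jar \
  --class com.example.Myclass \
  myapp.jar
```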