
I am trying to run a Spark job from my Spring web app and I am receiving this error:

java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;)[C
    at org.apache.spark.SSLOptions$.$anonfun$parse$8(SSLOptions.scala:188) ~[spark-core_2.12-2.4.3.jar:2.4.3]
    at scala.Option.orElse(Option.scala:306) ~[scala-library-2.12.8.jar:na]
    at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:188) ~[spark-core_2.12-2.4.3.jar:2.4.3]
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:117) ~[spark-core_2.12-2.4.3.jar:2.4.3]
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:236) ~[spark-core_2.12-2.4.3.jar:2.4.3]
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:185) ~[spark-core_2.12-2.4.3.jar:2.4.3]
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257) ~[spark-core_2.12-2.4.3.jar:2.4.3]
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424) ~[spark-core_2.12-2.4.3.jar:2.4.3]
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520) ~[spark-core_2.12-2.4.3.jar:2.4.3]
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:935) ~[spark-sql_2.12-2.4.2.jar:2.4.2]
    at scala.Option.getOrElse(Option.scala:138) ~[scala-library-2.12.8.jar:na]
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926) ~[spark-sql_2.12-2.4.2.jar:2.4.2]
Jeff
  • https://stackoverflow.com/questions/35186/how-do-i-fix-a-nosuchmethoderror – Stephan Hogenboom May 09 '19 at 11:32
  • Comparable to https://stackoverflow.com/questions/49059136/nifi-java-lang-nosuchmethoderror-org-apache-hadoop-conf-configuration-reloadexi — check the versions of the different libraries in your classpath: hadoop, spark – Conffusion May 09 '19 at 11:35

3 Answers


This error is a clear indication of a Hadoop library version mismatch.

I faced the same issue. Since I am using Maven and was not actually using the hadoop-client dependency, I commented it out, which solved the issue:

    <!--
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
        <scope>provided</scope>
    </dependency>
    -->
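After changing the dependencies, one way to verify that the Hadoop jar actually winning on the classpath provides the method from the stack trace is a small reflective probe. This is a sketch; `HadoopVersionProbe` is a hypothetical class name, and it relies only on the fact that `Configuration.getPassword(String)` was introduced in Hadoop 2.6:

```java
import java.lang.reflect.Method;

public class HadoopVersionProbe {
    // Returns true if the Configuration class visible on the classpath
    // declares getPassword(String) -- the method the NoSuchMethodError
    // in the question says is missing at runtime.
    static boolean hasGetPassword() {
        try {
            Class<?> conf = Class.forName("org.apache.hadoop.conf.Configuration");
            conf.getMethod("getPassword", String.class);
            return true;
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasGetPassword()
                ? "Configuration.getPassword(String) is present"
                : "Configuration.getPassword(String) is missing (no Hadoop jar, or one older than 2.6)");
    }
}
```

Running this inside the Spring app (or a test with the same classpath) tells you immediately whether an old Hadoop jar is shadowing the one Spark expects.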
Ram Ghadiyaram

The Hadoop versions in your dependencies differ; make sure every Hadoop dependency uses the same version.
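With Maven, one way to keep the versions aligned is to declare a single property and reference it from every Hadoop artifact. This is a sketch; the version shown is the one the Spark 2.4.x "hadoop2.7" builds are compiled against — substitute whatever your cluster actually runs:

```xml
<properties>
    <!-- pick one Hadoop version and reuse it everywhere -->
    <hadoop.version>2.7.3</hadoop.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <!-- any other org.apache.hadoop artifacts should also use ${hadoop.version} -->
</dependencies>
```

Running `mvn dependency:tree -Dincludes=org.apache.hadoop` shows which Hadoop versions (including transitive ones) actually end up on the classpath.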

Tutu Kumari

I was also getting the same error with Hadoop 2.7.2 and Spark 2.3.1.

I resolved the issue by removing outdated JAR files from C:\work\spark-2.3.1-bin-hadoop2.7\jars. The directory contained jars with both 2018 and 2019 modification dates; I removed all the 2018 files and kept only the 2019 ones.

Cheers! Issue is resolved!

GokuVegeta