
I do not have Java 1.8 installed on my machine, yet somehow I am receiving this error. Other Stack Overflow answers suggest compiling the code with Java 1.8, but I want to understand the reason for the following error:

Exception in thread "main" java.lang.UnsupportedClassVersionError: com/typesafe/config/ConfigValue : Unsupported major.minor version 52.0
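For context, class file version 52.0 corresponds to Java 8, so the jar that provides com/typesafe/config/ConfigValue on my classpath appears to be compiled for Java 8 while the JVM running the program is older. A minimal sketch (using only standard system properties) to print what the runtime actually reports:

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // java.class.version reports the class-file format this JVM supports;
    // 52.0 means Java 8 class files can be loaded, anything lower means they cannot.
    println("java.version       = " + System.getProperty("java.version"))
    println("java.class.version = " + System.getProperty("java.class.version"))
  }
}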

Here is my code:

pom.xml

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.1</version>
</dependency>

Scala Code

import org.apache.spark.{SparkConf, SparkContext}

object SparkWithHbase {
  def main(args: Array[String]) {
    System.out.println("Java Version: " + System.getProperty("java.version"))

    // Initiate the Spark context with the Spark master URL. Modify the URL for your environment.
    val sparkConf = new SparkConf().setAppName("Spark Hbase").setMaster("spark://10.41.50.126:7077")
    val sc = new SparkContext(sparkConf) // Failing at this line
  }
}
Sachin Jain
  • You need to check the version of your spark cluster. – eliasah Jun 06 '16 at 05:43
  • @eliasah I don't think it has anything to do with the Java version of the Spark cluster, because when I change the IP of the Spark cluster in my code to an invalid IP, it still throws the same error. – Sachin Jain Jun 06 '16 at 06:44
  • As @eliasah pointed out, this is an issue with a version mismatch between your Spark dependencies and your Spark cluster. – Amit Kumar Jun 06 '16 at 08:06
  • Earlier I was running this code from the IDE, which is what was giving me this problem. Here is how I solved it: I added the maven-shade-plugin to create an uber jar with all dependencies, then used the spark-submit script to submit the uber jar and execute the job (a sketch follows below). This indeed worked fine. Just a helpful note for whoever gets stuck on the same problem. Here is the code: https://github.com/practicebook/hbase-sandbox – Sachin Jain Jun 07 '16 at 00:57
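A minimal pom.xml sketch of the shading approach from the last comment, assuming the stock maven-shade-plugin (the plugin version shown is only illustrative):

<build>
  <plugins>
    <!-- Bundle the application and its dependencies into one uber jar,
         so spark-submit ships a single self-contained artifact. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

The jar produced by mvn package can then be submitted with something like spark-submit --class SparkWithHbase --master spark://10.41.50.126:7077 target/<your-artifact>.jar, where the exact jar name depends on your artifactId and version.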
