I do not have Java 1.8 installed on my machine, and yet I am receiving the error below. Other Stack Overflow answers suggest compiling the code with Java 1.8, but I want to understand the actual cause of the error:
Exception in thread "main" java.lang.UnsupportedClassVersionError: com/typesafe/config/ConfigValue : Unsupported major.minor version 52.0
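To see where the 52.0 comes from, I inspected the class file header of the offending dependency. This is a minimal sketch (the class name `ClassVersionCheck` and reading the class's own bytecode are just for illustration; in practice you would point it at `ConfigValue.class` inside the jar). A major version of 52 means the class was compiled for Java 8, 51 means Java 7, and 50 means Java 6:

```java
import java.io.DataInputStream;
import java.io.InputStream;

public class ClassVersionCheck {
    public static void main(String[] args) throws Exception {
        // Read this class's own bytecode from the classpath; substitute any
        // .class file (e.g. com/typesafe/config/ConfigValue.class from the jar)
        try (InputStream in = ClassVersionCheck.class
                .getResourceAsStream("ClassVersionCheck.class");
             DataInputStream data = new DataInputStream(in)) {
            int magic = data.readInt();          // always 0xCAFEBABE for a class file
            int minor = data.readUnsignedShort();
            int major = data.readUnsignedShort(); // 52 = Java 8, 51 = Java 7, 50 = Java 6
            System.out.println("major.minor = " + major + "." + minor);
        }
    }
}
```

So the error means the running JVM is older than the bytecode it was asked to load: `com/typesafe/config/ConfigValue` (a transitive dependency of Spark) was compiled for Java 8, while my runtime is older.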
Here is my code:
pom.xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.1</version>
</dependency>
Scala Code
import org.apache.spark.{SparkConf, SparkContext}

object SparkWithHbase {
  def main(args: Array[String]) {
    System.out.println("Java Version: " + System.getProperty("java.version"))
    // Initiate the Spark context with the Spark master URL. Modify the URL for your environment.
    val sparkConf = new SparkConf().setAppName("Spark Hbase").setMaster("spark://10.41.50.126:7077")
    val sc = new SparkContext(sparkConf) // Failing at this line
  }
}