
I am new to Spark and trying to set it up on my local machine (Mac), which has Java version 1.7.0_80.

Followed these steps:

  1. Downloaded Apache Spark, version spark-2.3.1-bin-hadoop2.6.tgz, from https://spark.apache.org/downloads.html.
  2. Untarred the file, renamed the folder to spark, and moved it to /usr/local/spark.
  3. In /usr/local/spark, ran this command: bin/spark-shell

Got this Java error:

Rajeev: spark rajeevnair$ bin/spark-shell

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/launcher/Main : Unsupported major.minor version 52.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
OneCricketeer
Rajeev A Nair

1 Answer


You need Java 8 to run Spark 2.2.0 or later. The UnsupportedClassVersionError occurs because major.minor version 52.0 is the Java 8 class-file format, which your Java 7 runtime cannot load. See the Spark documentation for details:

Building Spark using Maven requires Maven 3.3.9 or newer and Java 8+. Note that support for Java 7 was removed as of Spark 2.2.0.

So your options are upgrading to Java 8 or using an older version of Spark (2.1.x or earlier).
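If you want to confirm which Java a shell session will pick up before retrying, a quick check like the following can help. This is just a sketch: it assumes `java` is on your PATH and that it prints a legacy-style version string (the `"1.x"` format used through Java 8); the `java_major` helper is a hypothetical name, not part of Spark.

```shell
# Hypothetical helper: extract the major version from a legacy Java
# version line such: java version "1.7.0_80"  ->  7
java_major() {
  echo "$1" | sed -E 's/.*"1\.([0-9]+).*/\1/'
}

# Example using the version line from the question:
major=$(java_major 'java version "1.7.0_80"')
if [ "$major" -lt 8 ]; then
  echo "Java 1.$major is too old for Spark 2.2.0+; install Java 8"
fi

# Against a live system you would feed it the real output:
#   java_major "$(java -version 2>&1 | head -n 1)"
```

On the asker's machine this would report that Java 1.7 is too old, matching the stack trace above.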

Harald Gliebe