
I have a Spark project written in Java 8 that I want to deploy on EC2. I've run into the following problem:

Unsupported major.minor version 52.0

This could be fixed by switching the project to Java 7, but to do that I would need to change a lot of things.

Is there any way to make Spark use Java 8? (I have Java 8 installed on all the machines in the cluster.)

Thanks!

Tomy
  • Class files compiled for Java 7 should work seamlessly on Java 8 (the opposite is not true and can result in errors like the one you describe). Are you absolutely sure that Java 8 is installed *and used* everywhere? – Jakob Odersky Mar 18 '16 at 23:03
  • export JAVA_HOME=/usr/lib/jvm/java-1.8.0 – G B Jun 18 '16 at 09:02
  • and pass the --conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0 option to spark-submit. This of course assumes Java 8 is installed. – G B Jun 18 '16 at 09:03
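
Combining the two comments above, a minimal sketch might look like this (the /usr/lib/jvm/java-1.8.0 path is the one from the comments; the class name, master URL and jar name are placeholders to adjust for your setup):

    # conf/spark-env.sh on every node: tell Spark which JVM to launch with
    export JAVA_HOME=/usr/lib/jvm/java-1.8.0

    # At submit time, also forward JAVA_HOME to the executor processes
    spark-submit \
      --class com.example.MyApp \
      --master spark://ec2-master-host:7077 \
      --conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0 \
      my-app.jar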

2 Answers


This error straightforwardly means that Java 7 is in use on your EC2 machine. Use "alternatives --config java" to choose which Java version should be used on the machine (on Linux) if you have multiple Java installations.

Also, remember to set the Java environment variables.
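
For example, on a RHEL-style EC2 image that already has both JDKs installed, the switch and the environment variables might look roughly like this (the exact path depends on how Java 8 was installed):

    # Interactively choose the Java 8 entry as the default java binary
    sudo alternatives --config java

    # Set the environment variables, e.g. in ~/.bashrc or /etc/profile.d/java.sh
    export JAVA_HOME=/usr/lib/jvm/java-1.8.0
    export PATH="$JAVA_HOME/bin:$PATH"

    # Verify the machine now runs Java 8 (class file version 52.0 requires a 1.8 JVM)
    java -version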

Abhishek Anand

I have seen this error before, and it has always been caused by two different versions of the same component being used. You will need to check the components you have included.
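
One quick way to check which Java version a particular class in your build was compiled for is javap (the jar and class names below are placeholders):

    # Extract a class from the application jar and inspect its bytecode level
    # (major version 52 = Java 8, 51 = Java 7)
    unzip -p my-app.jar com/example/MyApp.class > /tmp/MyApp.class
    javap -verbose /tmp/MyApp.class | grep 'major version'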

Vod
  • Downgrading to Java 7 meant I wouldn't get that error anymore, but I'd rather not downgrade. – Tomy Mar 18 '16 at 20:39