After installing jdk9 I have been seeing this problem:

$hive
Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hive/2.3.1/libexec/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hadoop/2.8.1/libexec/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.ClassCastException: java.base/jdk.internal.loader.ClassLoaders$AppClassLoader cannot be cast to java.base/java.net.URLClassLoader
    at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:394)
    at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:370)
    at org.apache.hadoop.hive.cli.CliSessionState.<init>(CliSessionState.java:60)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:708)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:564)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:148)

But I have

Updated $PATH to point to java8

$java -version
java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode)

Updated hive executable to specify java8

vi $(which hive)

#!/bin/bash
JAVA_HOME="$(/usr/libexec/java_home --version 1.8)" HIVE_HOME="/usr/local/Cellar/hive/2.3.1/libexec" exec "/usr/local/Cellar/hive/2.3.1/libexec/bin/hive" "$@"

Verified the updated java version does point to jdk8

$/usr/libexec/java_home --version 1.8
/Library/Java/JavaVirtualMachines/jdk1.8.0_144.jdk/Contents/Home

What else should I be looking into here?

This is Hive 2.3.1 on macOS.

$hive --version
Hive 2.3.1
Git git://jcamachorodriguez-rMBP.local/Users/jcamachorodriguez/src/workspaces/hive/HIVE-apache/hive -r 7590572d9265e15286628013268b2ce785c6aa08
Compiled by jcamachorodriguez on Thu Oct 19 18:37:58 PDT 2017
From source with checksum 03c91029a6103bd91f25a6ff8a01fbcd
    Is there a newer version of Hive? MaxPermSize has been obsolete since JDK 8. The ClassCastException is Hive making an assumption that the system class loader is a URLClassLoader (this changed in JDK 9, see release notes here: http://www.oracle.com/technetwork/java/javase/9-relnote-issues-3704069.html#JDK-8142968). – Alan Bateman Dec 28 '17 at 08:58
    As mentioned in the OP, I updated the path to point to `JDK8` to avoid the `JDK9` issues that you mention. – WestCoastProjects Jan 03 '18 at 18:52

4 Answers


Installing jdk8 and changing the JAVA_HOME path accordingly in hadoop-env.sh did the trick for me.
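A minimal sketch of that change, assuming a Homebrew-style layout like the one in the question (the exact hadoop-env.sh path is an assumption; adjust it for your install):

```shell
# In $HADOOP_HOME/etc/hadoop/hadoop-env.sh (path is an assumption; for the
# Homebrew install shown in the question it would be something like
# /usr/local/Cellar/hadoop/2.8.1/libexec/etc/hadoop/hadoop-env.sh).
# Hadoop's launcher scripts source this file at startup, so this JAVA_HOME
# wins over whatever the interactive shell exports.
export JAVA_HOME="$(/usr/libexec/java_home --version 1.8)"
```

This is why exporting JAVA_HOME in your shell profile alone may not take effect: the Hadoop (and therefore Hive) launch scripts re-set it from hadoop-env.sh.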

Legend-IDK

I was trying to install Hive on Ubuntu (Hadoop 3.3.6 and Hive 3.1.3) and ran into the same issue. I had both Java 11 and Java 8 installed side by side. Changing JAVA_HOME in the shell did not work; the solution was to change JAVA_HOME in hadoop-env.sh from Java 11 to Java 8, as Legend-IDK mentioned above. Hive now works on my Ubuntu laptop.


I had the same problem. Instead of changing the environment, I just removed JDK 9, and the problem was solved.

Looking at the hive wrapper script, it uses `--version 1.7+`, which `/usr/libexec/java_home` resolves to the newest matching JDK installed (so JDK 9 wins as long as it is present):

JAVA_HOME="$(/usr/libexec/java_home --version 1.7+)"
HIVE_HOME="/usr/local/Cellar/hive/2.3.1/libexec"
exec "/usr/local/Cellar/hive/2.3.1/libexec/bin/hive" "$@"
player six
    I have a need for JDK9 to be available on the system. Maybe this approach can work as a workaround - but I was hoping there would be a configuration-based solution. – WestCoastProjects Jan 03 '18 at 18:51

I was facing the same error while setting up Hive. Initially I thought it could be because of different Java versions.

But on checking, `java -version` reported JDK 1.8.

Finally, on checking the JDK install directory (/Library/Java/JavaVirtualMachines), I found that both JDK 1.8 and JDK 10 were present.

I removed JDK 10, and it finally worked.
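Before deleting a JDK it helps to list what is actually installed; on macOS the stock `java_home` helper does this (the exact output varies by machine):

```shell
# -V lists every installed JVM with its version and home directory
/usr/libexec/java_home -V
# With no flags it prints the default (newest) JDK home - the one that
# tools pick up when they don't constrain the version
/usr/libexec/java_home
```

This also explains the symptom in the thread: anything that calls `/usr/libexec/java_home` without a tight `--version` constraint resolves to the newest JDK installed, even when PATH points at JDK 8.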