
Issue:

Hive was working from the CLI. When I start Hive using Beeline:

beeline -u "jdbc:hive2://********:10000/default" -n **** -p **** -d "org.apache.hive.jdbc.HiveDriver"

I am getting the following exception:

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Error: java.io.IOException: com.google.protobuf.ServiceException: java.lang.OutOfMemoryError: GC overhead limit exceeded (state=,code=0)
Beeline version 0.14.0 by Apache Hive

What I tried:

I know restarting the Hive service will not solve the underlying problem, but I gave it a try anyway, with no luck.

I tried to clear /tmp:

rm -rf *

Now the Hive CLI also fails to start, with the following error:

$ hive
Error creating temp dir in java.io.tmpdir /tmp due to Read-only file system
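For reference, a quick way to check whether /tmp has been remounted read-only, and to point the CLI at a writable temp dir as a stopgap, is a sketch like the one below; /var/tmp/hive is an assumed path, and HADOOP_CLIENT_OPTS is the standard Hadoop variable for passing JVM options to client tools:

# Check how /tmp is mounted; "ro" among the options means read-only
mount | grep ' /tmp '

# If it has been remounted read-only (often after disk errors), remount it read-write
sudo mount -o remount,rw /tmp

# Stopgap: point the Hive CLI's JVM at another writable temp dir (assumed path)
mkdir -p /var/tmp/hive
export HADOOP_CLIENT_OPTS="$HADOOP_CLIENT_OPTS -Djava.io.tmpdir=/var/tmp/hive"
hive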

Are there any specific Hive properties that I need to look into?

Please guide me. Any help would be appreciated.

Aditya
  • Possible duplicate of [Error java.lang.OutOfMemoryError: GC overhead limit exceeded](http://stackoverflow.com/questions/1393486/error-java-lang-outofmemoryerror-gc-overhead-limit-exceeded) – Ravindra babu Aug 11 '16 at 05:59
  • You can find many solutions in above linked question – Ravindra babu Aug 11 '16 at 05:59
  • In my case, I am getting the exception while starting Hive; please suggest what is needed. The suggested post shows how to solve it while running a Java program – Aditya Aug 11 '16 at 06:05
  • Could you increase the heap size and then start fresh again? – Bector Aug 11 '16 at 06:47

1 Answer


@Addy, you can try the solution mentioned below. It should work for you.

if [ "$SERVICE" = "cli" ]; then
  if [ -z "$DEBUG" ]; then
    export HADOOP_OPTS="$HADOOP_OPTS -XX:NewRatio=12 -Xmx12288m -Xms10m -XX:MaxHeapFreeRatio=40 -XX:MinHeapFreeRatio=15 -XX:+useParNewGC -XX:-useGCOverheadLimit"
  else
    export HADOOP_OPTS="$HADOOP_OPTS -XX:NewRatio=12 -Xmx12288m -Xms10m -XX:MaxHeapFreeRatio=40 -XX:MinHeapFreeRatio=15 -XX:-useGCOverheadLimit"
  fi
fi

export HADOOP_HEAPSIZE=2048
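Once the change is in place, you can confirm that the new flags were actually picked up by inspecting the running JVM; jps -v and ps are standard tools here, and the grep patterns are only assumptions about the process names:

# Show JVM processes with the flags they were started with
jps -v | grep -i hive

# Or check the full command line of the Hive / HiveServer2 process
ps -ef | grep -i hive | grep -v grep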

For more details, you can also visit the Cloudera blog post on managing Hive.

Bector
  • Thank you Bector. When I restarted the server and started the processes fresh, the error vanished, but I will implement your suggestion for all future runs when starting Hive – Aditya Aug 11 '16 at 13:29
  • Note: Not sure whether it's a different Java or Hive version, but in my case ``-XX:-useGCOverheadLimit`` required correction to ``-XX:-UseGCOverheadLimit`` with a capital ``U``. – runr Jun 28 '18 at 08:23