
When I run the Spark example spark-hive-tables, I get this error in the Hadoop UI:

User class threw exception: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

and this warning:

executor.CoarseGrainedExecutorBackend: An unknown (x.x.x.x:x) driver disconnected.

However, I have already started the Hive metastore on my Spark-on-YARN cluster. What should I do?
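For context, Spark locates a remote metastore through a hive-site.xml on its classpath (typically the Spark conf directory). A minimal sketch of that file, assuming a standalone metastore on a hypothetical host metastore-host listening on the default Thrift port 9083:

```xml
<!-- hive-site.xml, placed in the Spark conf directory.
     "metastore-host" is a placeholder; use your actual metastore host. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```

If this property is absent, Hive falls back to an embedded local metastore, which is a common cause of the SessionHiveMetaStoreClient instantiation error above.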

Brian Tompsett - 汤莱恩
paxi
  • Please take a look at this: https://stackoverflow.com/questions/22711364/java-lang-runtimeexceptionunable-to-instantiate-org-apache-hadoop-hive-metastor – philantrovert Jul 04 '17 at 09:41
  • There is no metastore_db/*.lck under my Spark directory, hive.metastore.schema.verification is false, and I am using MySQL. – paxi Jul 04 '17 at 10:46

2 Answers


It means your metastore service has not been started. Start the metastore service on the machine where Hive is installed, or on the remote host if your metastore runs remotely.

To start the metastore, use: hive --service metastore
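A sketch of starting the service and confirming it is up (9083 is the metastore's default Thrift port; adjust if you have changed it):

```shell
# Start the metastore in the background and keep its log for troubleshooting
nohup hive --service metastore > /tmp/metastore.log 2>&1 &

# Confirm something is listening on the default metastore port 9083
netstat -tlnp | grep 9083
```

If the second command prints nothing, check /tmp/metastore.log for the startup failure.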

What output did you get after starting the metastore service?

Pyd
  • I have a metastore running on each machine of my cluster: -Xmx256m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/data/opt/hadoop-2.6.0/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/data/opt/hadoop-2.6.0 -Dhadoop.id.str=root -Dhadoop.root.logger=INFO,console -Djava.library.path=/data/opt/hadoop-2.6.0/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx512m -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /data/opt/apache-hive-1.2.1-bin/lib/hive-service-1.2.1.jar org.apache.hadoop.hive.metastore.HiveMetaStore – paxi Jul 04 '17 at 10:49

I found out that I am using the Thrift server. After starting it with /SPARKPATH/sbin/start-thriftserver.sh, another error appeared, "java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory", which produces the errors shown in my title. It can be fixed by adding --jars /SPARKPATH/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar,/SPARKPATH/lib_managed/jars/datanucleus-core-3.2.10.jar,/SPARKPATH/lib_managed/jars/datanucleus-rdbms-3.2.9.jar
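Put together, the full invocation looks like this (a sketch: /SPARKPATH stands for the actual Spark installation directory, and the DataNucleus jar versions are the ones named above, so adjust them to match whatever is actually in lib_managed/jars):

```shell
# Start the Spark Thrift server with the DataNucleus jars on its classpath.
# /SPARKPATH is a placeholder for the real Spark install path; the jar list
# is comma-separated with no spaces, as --jars requires.
/SPARKPATH/sbin/start-thriftserver.sh \
  --jars /SPARKPATH/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar,/SPARKPATH/lib_managed/jars/datanucleus-core-3.2.10.jar,/SPARKPATH/lib_managed/jars/datanucleus-rdbms-3.2.9.jar
```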

paxi