
I have started the Spark Thrift Server on port 10015. It started successfully, and I can also connect to the metastore, but I'm unable to connect to the Spark Thrift Server with beeline. Please tell me if I have missed anything.

------- The process I followed is below -------
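- How the server was started (a sketch of a typical invocation with the stock script; my exact command line isn't reproduced here, and it assumes a standard $SPARK_HOME layout with the hive-site.xml below on its conf path):

$SPARK_HOME/sbin/start-thriftserver.sh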

[root@hadoopdashuju009154 bin]# netstat -an|grep 10015

tcp 0 0 10.2.9.154:10015 0.0.0.0:* LISTEN
[root@hadoopdashuju009154 bin]#

- My hive-site.xml under the $SPARK_HOME/conf directory:

<property>
  <name>hive.server2.thrift.port</name>
  <value>10015</value>
  <description>Port number of HiveServer2 Thrift interface. Can be overridden by setting $HIVE_SERVER2_THRIFT_PORT</description>
</property>
<property>
  <name>hive.server2.thrift.bind.host</name>
  <value>10.2.9.154</value>
  <description>Bind host on which to run the HiveServer2 Thrift interface. Can be overridden by setting $HIVE_SERVER2_THRIFT_BIND_HOST</description>
</property>
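(For what it's worth, the same two properties can also be passed at startup instead of via hive-site.xml; a sketch, assuming the stock start script:)

$SPARK_HOME/sbin/start-thriftserver.sh \
  --hiveconf hive.server2.thrift.port=10015 \
  --hiveconf hive.server2.thrift.bind.host=10.2.9.154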

- The error when I connect to the Spark Thrift Server using beeline:

[root@hadoopdashuju009154 bin]# ./beeline -u jdbc:hive2://hadoopdashuju009154.ppdgdsl.com:10015

scan complete in 2ms
Connecting to jdbc:hive2://hadoopdashuju009154.ppdgdsl.com:10015
18/05/23 17:28:57 INFO jdbc.Utils: Supplied authorities: hadoopdashuju009154.ppdgdsl.com:10015
18/05/23 17:28:57 INFO jdbc.Utils: Resolved authority: hadoopdashuju009154.ppdgdsl.com:10015
org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge.getCanonicalHostName(Ljava/lang/String;)Ljava/lang/String;
Beeline version 1.1.0-cdh5.14.0 by Apache Hive
0: jdbc:hive2://hadoopdashuju009154.ppdgdsl.c (closed)>

[root@hadoopdashuju009154 bin]#

  • You can find your answer here - https://stackoverflow.com/questions/28898936/beeline-not-able-to-connect-to-hiveserver2?rq=1 – aksss May 23 '18 at 11:15
  • I mean I'm using the Spark Thrift Server rather than the Hive Thrift Server, so that doesn't seem to apply. – hu li May 23 '18 at 12:48
  • Is there anyone who can give me a suggestion? – hu li May 24 '18 at 01:40
  • I solved it by setting up another Spark installation without CDH, downloading the missing jar, and putting it into my local Spark (see the sketch below). Thanks. – hu li May 24 '18 at 08:14
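For anyone who hits the same failure: the bare method signature printed above (HadoopThriftAuthBridge.getCanonicalHostName) is the tell-tale of a NoSuchMethod-style mismatch between Hive client jars, i.e. the CDH 5.14 beeline loads a HadoopThriftAuthBridge class that lacks a method its JDBC driver expects. A sketch of the workaround described in the last comment follows; the jar name is illustrative, not the exact artifact:

# use the beeline bundled with Spark, whose Hive jars are mutually consistent
$SPARK_HOME/bin/beeline -u jdbc:hive2://hadoopdashuju009154.ppdgdsl.com:10015

# or put a Hive jar that actually contains the missing method onto the
# client classpath (illustrative jar name; match your Spark's Hive version)
cp hive-shims-common-1.2.1.jar $SPARK_HOME/jars/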

0 Answers