
It seems the PySpark shell in Docker is working in local-client mode and is able to connect to Hive. However, issuing spark-submit with all dependencies fails with the error below.

20/08/24 14:03:01 INFO storage.BlockManagerMasterEndpoint: Registering block manager test.server.com:41697 with 6.2 GB RAM, BlockManagerId(3, test.server.com, 41697, None)
20/08/24 14:03:02 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
20/08/24 14:03:02 INFO hive.metastore: Trying to connect to metastore with URI thrift://metastore.server.com:9083
20/08/24 14:03:02 ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)

Running a simple Pi example through PySpark works fine with no Kerberos issues, but when trying to access Hive I get a Kerberos error.
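
(For reference, the Pi test was something along these lines; the exact example path is an assumption based on the standard Spark distribution layout.)

# Pi example: pure computation, never opens a Hive metastore connection
spark-submit --master yarn --deploy-mode cluster \
$SPARK_HOME/examples/src/main/python/pi.py 100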

Spark-submit command:

spark-submit --master yarn \
--deploy-mode cluster \
--files=/etc/hive/conf/hive-site.xml,/etc/hive/conf/yarn-site.xml,/etc/hive/conf/hdfs-site.xml,/etc/hive/conf/core-site.xml,/etc/hive/conf/mapred-site.xml,/etc/hive/conf/ssl-client.xml \
--name fetch_hive_test \
--executor-memory 12g \
--num-executors 20 \
test_hive_minimal.py

test_hive_minimal.py is a simple PySpark script that shows the tables in a test DB:

from pyspark.sql import SparkSession
#declaration
appName = "test_hive_minimal"
master = "yarn"
# Create the Spark session
sc = SparkSession.builder \
    .appName(appName) \
    .master(master) \
    .enableHiveSupport() \
    .config("spark.hadoop.hive.enforce.bucketing", "True") \
    .config("spark.hadoop.hive.support.quoted.identifiers", "none") \
    .config("hive.exec.dynamic.partition", "True") \
    .config("hive.exec.dynamic.partition.mode", "nonstrict") \
    .getOrCreate()
# Run a free-form query against Hive
sql = "show tables in user_tables"
df_new = sc.sql(sql)
df_new.show()
sc.stop()

Can anyone throw some light on how to fix this? Aren't Kerberos tickets managed automatically by YARN? All other Hadoop resources are accessible.
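
(As a sanity check, one way to see whether a ticket is actually visible inside the container; this assumes the Kerberos client tools are installed in the image.)

# show any cached Kerberos tickets inside the container
klist
# if the cache is empty, obtain one manually from the keytab (see the update below)
kinit -kt /home/alias/.kt/alias.keytab alias@realm.com.org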

UPDATE: The issue was fixed by sharing a volume mount with the Docker container (the keytab is mounted and kept in a local path inside the container) and passing the keytab/principal along with hive-site.xml for accessing the metastore.

spark-submit --master yarn \
--deploy-mode cluster \
--jars /srv/python/ext_jars/terajdbc4.jar \
--files=/etc/hive/conf/hive-site.xml \
--keytab /home/alias/.kt/alias.keytab \
--principal alias@realm.com.org \
--name td_to_hive_test \
--driver-cores 2 \
--driver-memory 2G \
--num-executors 44 \
--executor-cores 5 \
--executor-memory 12g \
td_to_hive_test.py
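
(For completeness, the volume share amounted to mounting the host directory that holds the keytab into the container; a minimal sketch, where the image name and bash entrypoint are placeholders.)

# make the keytab visible inside the container at the path used above
docker run -it \
-v /home/alias/.kt:/home/alias/.kt:ro \
-v /etc/hive/conf:/etc/hive/conf:ro \
my-spark-client:latest bash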
StrangerThinks

2 Answers


I think your driver has tickets, but that is not the case for your executors. Add the following parameters to your spark-submit (see the sketch below):

  • --principal: you can get the principal this way: klist -k
  • --keytab: the path to the keytab

More information: https://spark.apache.org/docs/latest/running-on-yarn.html#yarn-specific-kerberos-configuration
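
A sketch of how the two flags slot into the original command (the principal/keytab values here are the ones from the question's update):

# list the principals stored in a keytab to find the right --principal value
klist -k /home/alias/.kt/alias.keytab

spark-submit --master yarn --deploy-mode cluster \
--principal alias@realm.com.org \
--keytab /home/alias/.kt/alias.keytab \
--files=/etc/hive/conf/hive-site.xml \
test_hive_minimal.py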

maxime G
  • Thank you. I did try passing my own principal and keytab as params to spark-submit, but it gave a different error; AFAIR it was something like 'TOKEN KIND error'. Is there any particular format I am missing? Also, I am using PySpark 2.4.6 with CDH 5.13, which has both Spark 1.6 and 2; the documentation is for 3.0.0. – StrangerThinks Aug 24 '20 at 22:55

Can you try the below command-line property while running the job on the cluster?

-Djavax.security.auth.useSubjectCredsOnly=false

You can add the above property to the spark-submit command.
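
One way to attach it, as a sketch, is via the standard extraJavaOptions settings so that both the driver and the executors pick it up:

spark-submit --master yarn --deploy-mode cluster \
--conf "spark.driver.extraJavaOptions=-Djavax.security.auth.useSubjectCredsOnly=false" \
--conf "spark.executor.extraJavaOptions=-Djavax.security.auth.useSubjectCredsOnly=false" \
test_hive_minimal.py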

Vijay_Shinde
  • Hey Vijay, how do you want it passed? As --conf spark.executor.extraJavaOptions / spark.driver.extraJavaOptions, or as --driver-java-options "-Djavax.security.auth.useSubjectCredsOnly=false"? I am trying it now. – StrangerThinks Aug 25 '20 at 14:26
  • I tried running with both executor and driver extraJavaOptions but am getting the below error: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) – StrangerThinks Aug 25 '20 at 15:13