I have Twitter data stored in an HDFS path. I am able to read the data into a Spark DataFrame as:
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
val df= hiveContext.read.json("/nifi/data/twitter/")
df.printSchema
df.show
Both commands show the results without any issue.
But when I try to save the DataFrame to a Hive table, I get the error below:
df.write.saveAsTable("tweets_32")
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException): No lease on /apps/hive/warehouse/tweets_32/_temporary/0/_temporary/attempt_201809260508_0002_m_000002_0/part-r-00002-c204b592-dc2a-4b2f-bc39-54afb237a6cb.gz.parquet (inode 1173647): File does not exist. [Lease. Holder: DFSClient_NONMAPREDUCE_14557453_1, pendingcreates: 1]>
Could someone let me know what the reason for this could be?
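For reference, this is the variant I was planning to try next, in case it matters: dropping any half-written table first and writing with an explicit format and save mode (the `parquet` format and `overwrite` mode here are my assumptions, not something I have run yet):

```scala
// Assumed workaround sketch, not yet tested:
// drop any partially created table so a stale
// /apps/hive/warehouse/tweets_32/_temporary path can't interfere,
// then write explicitly as parquet with overwrite mode.
hiveContext.sql("DROP TABLE IF EXISTS tweets_32")

df.write
  .format("parquet")
  .mode("overwrite")
  .saveAsTable("tweets_32")
```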