
I am trying to query a table stored in Hive; my QL is below. I want to store the result back into an existing Hive table as a new partition. My last line of code creates a new table instead. When I write the output as a file, it stores a Parquet file, but I am not able to read it through Hive. Can you please help?

My target table:

    CREATE EXTERNAL TABLE dq_reslt_detl_master (
        DQ_CHECK_ID string,
        PK_1 string,
        PK_2 int,
        D_RUNTIME string)
    PARTITIONED BY (eap_as_of_dt string)
    STORED AS PARQUETFILE
    LOCATION '/data/test/dq_reslt_detl_master'
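
One likely reason Hive cannot read the file written below: for a partitioned external table like this, Hive only reads files that sit under a partition registered in the metastore, so data dropped under the table LOCATION stays invisible until the partition is added. A minimal sketch of registering one, not from the original post (the partition value 20150501 is taken from the comments below and is only an example):

    # Sketch only, assuming a plain PySpark session; the partition value
    # '20150501' is an example quoted from the comments below.
    from pyspark import SparkContext
    from pyspark.sql import HiveContext

    sc = SparkContext()
    sqlContext = HiveContext(sc)
    sqlContext.sql(
        "ALTER TABLE dq_reslt_detl_master "
        "ADD PARTITION (eap_as_of_dt='20150501') "
        "LOCATION '/data/test/dq_reslt_detl_master/eap_as_of_dt=20150501'"
    )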

    from pyspark import sql
    from pyspark.sql import SQLContext, Row, HiveContext
    from pyspark.sql.types import *

    # Hive-aware context; read the source table and cache it up front.
    sqlContext = HiveContext(sc)
    dfsql = sqlContext.sql("select * from l1_amlmkt_mdwe.mdw_atlas_te")
    dfsql.registerTempTable("tmp_mdw_atlas_te")
    dfsql.cache()
    dfsql.count()  # materialize the cache

    # Columns to check for null/empty values.
    l1 = ['trd_ex_event_nb']

    i_detl_all = []
    for i in l1:
        # Summary query: how many rows fail the check; detail query: the failing rows.
        i_summ_1_sql = "select count(*) from amlmkt.k where {0} is null or {0} = ''".format(i)
        i_detl_1_sql = "select x, y, from_unixtime(unix_timestamp()) as exe_time from l1_amlmkt_mdwe.mdw_atlas_te where {0} is null or {0} = ''".format(i)
        i_detl_2 = sqlContext.sql(i_detl_1_sql)
        i_summ_2 = sqlContext.sql(i_summ_1_sql)
        # Problem line: this creates a new managed table instead of writing
        # into the existing partitioned table dq_reslt_detl_master.
        i_detl_2.write.saveAsTable("dq_result")
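
An alternative to the saveAsTable() call above that targets the existing table is a HiveQL insert through the same HiveContext. This is a sketch, not the poster's code: it assumes the detail query is adjusted to produce the four target columns, the column names in the SELECT are placeholders, and the partition value is the one quoted in the comments below:

    # Sketch only: append the loop's result into the existing table as a
    # static partition instead of creating a new table with saveAsTable().
    # The SELECT list must match the target columns (DQ_CHECK_ID, PK_1,
    # PK_2, D_RUNTIME); the names below are placeholders.
    i_detl_2.registerTempTable("tmp_dq_detl")
    sqlContext.sql("""
        INSERT INTO TABLE dq_reslt_detl_master
        PARTITION (eap_as_of_dt = '20150501')
        SELECT dq_check_id, pk_1, pk_2, d_runtime
        FROM tmp_dq_detl
    """)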
  • Possible duplicate of [Hive doesn't read partitioned parquet files generated by Spark](http://stackoverflow.com/questions/33551878/hive-doesnt-read-partitioned-parquet-files-generated-by-spark) – zero323 Jun 30 '16 at 11:10
  • I don't want to write in Parquet format. I want to save the RDD into a table which is already created. The first write works fine, but it creates a table on the fly. I need the second one, which should store my result in the already-created table: `i_detl_2.write.mode("append").saveAsTable("l1_amlmkt_mdwe.dq_result")` and `i_detl_2.write.mode("append").format("text").save("/data/test/dq_reslt_detl_master/eap_as_of_dt=20150501/")` – user3858193 Jun 30 '16 at 11:31
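
On the second command in the comment above: the table is declared as Parquet, so writing with format("text") into the partition directory produces files Hive cannot read even once the partition exists. A hedged sketch of the corrected write, plus a metastore repair so Hive discovers the new partition directory:

    # Sketch only: write Parquet (matching the table's declared format) into
    # the partition path quoted in the comment, then refresh the metastore
    # so the new partition directory becomes visible to Hive.
    i_detl_2.write.mode("append").parquet(
        "/data/test/dq_reslt_detl_master/eap_as_of_dt=20150501/")
    sqlContext.sql("MSCK REPAIR TABLE dq_reslt_detl_master")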

0 Answers