I am running the code below in Spark to create a table temp1 with 200 partitions. But when I check the actual number of partitions by creating an RDD from the temp1 table, it comes out to more than 200. How is this possible? Am I missing something? It would be really helpful if anyone could tell me what I am overlooking. Thanks!
import hiveContext.implicits._   // needed for the $"NDC" column syntax

val TransDataFrame = hiveContext.sql(
  s"""SELECT *
      FROM uacc.TRANS
      WHERE PROD_SURRO_ID != 0
        AND MONTH_ID >= 201401
        AND MONTH_ID <= 201403
        AND CRE_DT <= '2016-11-13'
   """)
  .repartition(200, $"NDC")        // hash-partition into 200 partitions on NDC
  .registerTempTable("temp")
hiveContext.sql(
  s"""CREATE TABLE uacc.temp1
      AS SELECT * FROM temp
   """)
val df = hiveContext.sql("SELECT * FROM uacc.temp1")
df.rdd.getNumPartitions          // returns 1224, not the expected 200
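
For reference, here is a small sanity check (a sketch, not part of the job above; it only assumes the same hiveContext and the "temp" temp table registered earlier). It should show that the repartition itself does produce 200 partitions, so the larger number only appears once uacc.temp1 is read back:

// Hypothetical sanity check, using the same hiveContext and the "temp"
// temp table registered above.
val repartitioned = hiveContext.table("temp")

// repartition(200, $"NDC") fixes the number of partitions of the
// in-memory DataFrame, so this should print 200.
println(repartitioned.rdd.getNumPartitions)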