I use SparkR to run a SQL query and get back a SparkR DataFrame:

    data = sql(sql_query)

I can get the dimensions of the data with `dim(data)`.
However, when I try to take a look at the data with `head(data)`, it fails with this error:

    java.lang.ClassCastException: org.apache.hadoop.hive.serde2.io.TimestampWritable
        cannot be cast to org.apache.hadoop.io.IntWritable
I tried the same SQL query directly in Hive and it runs without any problem. The weird thing is that I can get the dimensions but cannot get the head.
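For context, my session looks roughly like this (the table name is hypothetical; the real query selects from a Hive table that includes a timestamp column):

    library(SparkR)
    # Start a SparkR session with Hive support (Spark 2.x style API)
    sparkR.session(enableHiveSupport = TRUE)

    # Hypothetical stand-in for my actual query
    sql_query <- "SELECT * FROM my_table"
    data <- sql(sql_query)

    dim(data)   # works: returns row and column counts
    head(data)  # fails with the ClassCastException above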
Any ideas?