
I am running Hadoop 2.2.0 + Hive 0.13.0 on a cluster with 5 datanodes. The WordCount example runs successfully, and creating tables in the Hive CLI works fine. But whenever I run a Hive query that launches MapReduce jobs, I keep getting errors like:

Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: java.io.FileNotFoundException: HIVE_PLAN7b8ea437-8ec3-4c05-af4e-3cd6466dce85 (No such file or directory)
    at org.apache.hadoop.hive.ql.exec.Utilities.getMapRedWork(Utilities.java:230)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:255)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:381)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:374)
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:540)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:167)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:408)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.io.FileNotFoundException: HIVE_PLAN7b8ea437-8ec3-4c05-af4e-3cd6466dce85 (No such file or directory)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:146)
    at java.io.FileInputStream.<init>(FileInputStream.java:101)
    at org.apache.hadoop.hive.ql.exec.Utilities.getMapRedWork(Utilities.java:221)
    ... 12 more


FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

Thanks in advance!


1 Answer


I finally found the problem: I was also running Shark 0.9.1 on the same cluster, which is compiled against Hive 0.11. When YARN started, it picked up the Hive 0.11 jar files, and that version mismatch led to the error.

I removed the Shark classpath from yarn.application.classpath in yarn-site.xml, and the error was fixed.
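
For reference, a minimal sketch of what the cleaned-up property can look like in yarn-site.xml. The entries shown are the usual Hadoop 2.x defaults; the exact paths on your cluster (and the Shark/Hive 0.11 directories you need to drop) will differ, so treat this only as an illustration:

    <!-- yarn-site.xml: keep only the Hadoop jars on the YARN application classpath.
         The Shark lib directory (with its bundled hive-0.11 jars) is simply left out. -->
    <property>
      <name>yarn.application.classpath</name>
      <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/share/hadoop/common/*,$HADOOP_COMMON_HOME/share/hadoop/common/lib/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,$HADOOP_YARN_HOME/share/hadoop/yarn/*,$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*</value>
    </property>

After editing, make sure the updated yarn-site.xml is deployed to every node in the cluster so all jobs see the same classpath.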

Thank you.