
I am trying to run a MapReduce job with Hadoop, YARN and Accumulo.

I am getting the following output, and I cannot track down the issue. It looks to be a YARN problem, but I am not sure what it is looking for. I have an nmPrivate folder at $HADOOP_PREFIX/grid/hadoop/hdfs/yarn/logs. Is this the folder it says it cannot find?

14/03/31 08:48:46 INFO mapreduce.Job: Job job_1395942264921_0023 failed with state FAILED due to: Application application_1395942264921_0023 failed 2 times due to AM Container for appattempt_1395942264921_0023_000002 exited with exitCode: -1000 due to: Could not find any valid local directory for nmPrivate/container_1395942264921_0023_02_000001.tokens. Failing this attempt. Failing the application.
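
From what I can tell, the nmPrivate directory is created under each path listed in yarn.nodemanager.local-dirs in yarn-site.xml (not under the log directory), and this error seems to mean none of those paths was usable (missing, unwritable, or out of disk space). Here is a sketch of that property; the value below is a placeholder, not my actual configuration:

<property>
  <name>yarn.nodemanager.local-dirs</name>
  <value>/grid/hadoop/hdfs/yarn/local</value>
</property>

I assume checking that each listed directory exists, is owned by the YARN user, and has free space is a reasonable first step, but I am not sure what else YARN expects here.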

1 Answer


When I tested spark-submit on YARN in cluster mode:

spark-submit --master yarn --deploy-mode cluster --class org.apache.spark.examples.SparkPi /usr/local/install/spark-2.2.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.2.0.jar 100

I got the same error:

Application application_1532249549503_0007 failed 2 times due to AM Container for appattempt_1532249549503_0007_000002 exited with exitCode: -1000 Failing this attempt. Diagnostics: java.io.IOException: Resource file:/usr/local/install/spark-2.2.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.2.0.jar changed on src filesystem (expected 1531576498000, was 1531576511000)

There is one common suggestion for resolving this kind of error: revise your core-site.xml or other Hadoop configuration files.

Finally, I fixed the error by setting the property fs.defaultFS in $HADOOP_HOME/etc/hadoop/core-site.xml.
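
For reference, a minimal sketch of that setting; the host and port below are placeholders, so substitute your own NameNode address:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://namenode-host:9000</value>
</property>

With fs.defaultFS pointing at HDFS rather than the local filesystem, the application jar is localized from a consistent source, which is presumably why the timestamp check stopped failing in my case.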