
I'm trying to start a MapReduce job from Java, but when I try to submit the job I get a Permission Denied exception. I'm able to run hdfs dfs -ls / from the command line without any error, but it doesn't work from my Java program.

Here's my code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public static void main(String[] args) {
    Configuration conf = new Configuration();

    conf.set("mapreduce.map.class", "org.apache.hadoop.conf.TestMapper");
    conf.set("mapreduce.reduce.class", "org.apache.hadoop.conf.TestReducer");

    conf.set("mapreduce.framework.name", "yarn");

    conf.set("hadoop.security.group.mapping", "org.apache.hadoop.security.ShellBasedUnixGroupsMapping");

    // fs.default.name is deprecated; fs.defaultFS is the current property name
    conf.set("fs.default.name", "hdfs://master:9000");

    conf.set("dfs.permission", "false");

    conf.set("yarn.nodemanager.aux-services", "mapreduce_shuffle");
    conf.set("yarn.resourcemanager.resource-tracker.address", "master:8025");
    conf.set("yarn.resourcemanager.scheduler.address", "master:8030");
    conf.set("yarn.resourcemanager.address", "master:8040");
    conf.set("yarn.nodemanager.localizer.address", "master:8060");

    Job job = null;
    try {
        job = Job.getInstance(conf, "Test Map Reduce");

        job.setJarByClass(RunJob.class);

        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);

        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);

        TextInputFormat.setInputPaths(job, new Path("/input.txt"));
        TextOutputFormat.setOutputPath(job, new Path("/output"));

        job.submit();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

But I get the following exception:

org.apache.hadoop.security.AccessControlException: Permission denied: user=manthosh, access=EXECUTE, inode="/tmp":hduser:supergroup:drwxrwx---
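Could the problem be that dfs.permission isn't a real property name? As far as I can tell, HDFS permission checking is controlled server-side by dfs.permissions.enabled in the NameNode's hdfs-site.xml, so setting anything on the client Configuration wouldn't help. Disabling it would look something like this (just a sketch; I understand this isn't recommended for production, and that the cleaner fix would be running the client as a user that can traverse /tmp, or opening up /tmp with hdfs dfs -chmod 1777 /tmp):

```xml
<!-- hdfs-site.xml on the NameNode; requires a NameNode restart to take effect -->
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
```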

The solution here doesn't work.

What am I missing?
