I'm writing simple code with Hadoop in Java. When I package it into a jar file and run it from the CLI, it works. But when I try to run it with the Tool interface and ToolRunner, I get an exception:
Exception in thread "main" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:659)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:447)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:293)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:145)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1297)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1294)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1315)
at pl.flomedia.hadoop.Main.run(Main.java:52)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at pl.flomedia.hadoop.Main.main(Main.java:28)
And this is my code (configuration only):
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class Main extends Configured implements Tool {

    // mvn clean package antrun:run@deploy
    public static void main(String[] args) throws Exception {
        int res = ToolRunner.run(new Configuration(), new Main(), args);
        System.exit(res);
    }

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = this.getConf();
        conf.set("mapred.job.tracker", "hadoop-master:8021");
        conf.set("fs.default.name", "hdfs://hadoop-master:9000/user/vagrant");
        conf.set("hadoop.job.ugi", "vagrant");
        System.setProperty("HADOOP_USER_NAME", "vagrant");

        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.isDirectory(new Path("/user/vagrant")));
        // fs.mkdirs(new Path("input"));

        Job job = Job.getInstance(conf, "passent");
        job.setJarByClass(Main.class);
        job.setMapperClass(PasswordMapper.class);
        job.setReducerClass(EntrophyPassReducer.class);
        job.setSortComparatorClass(EntrophyDescComparator.class);
        job.setOutputKeyClass(DoubleWritable.class);
        job.setOutputValueClass(Text.class);

        FileInputFormat.addInputPath(job, new Path("input"));
        FileOutputFormat.setOutputPath(job, new Path("output"));

        return job.waitForCompletion(true) ? 0 : 1;
    }
}
Can someone help me? :) Thanks in advance!