
I encountered an issue where the `hadoop jar` command doesn't actually create its temporary unjar directory in /mnt/hadoop/tmp/hadoop-${user.name}.
The context: I want to read a shell script file from a jar using XXX.class.getResourceAsStream("/my_shell.sh"). If I run the jar with the `java -cp` command, it executes as I expect.
But if I run the same jar with the `hadoop jar` command, I can't get the resource.
After debugging I found that the resource URL looks like url=file:/mnt/hadoop/tmp/hadoop-${user.name}/hadoop-unjar1355749400200754398/my_shell.sh, but no hadoop-unjar1355749400200754398 directory is actually created under /mnt/hadoop/tmp/hadoop-${user.name}/.
No error message is printed either. I have searched through many docs but couldn't find anything about this issue.

Note: the main class in this jar doesn't submit any Hadoop job; it only tries to read a shell file from the jar.
The Hadoop version is 0.20.2-cdh3u3.
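For reference, a minimal sketch of the extraction pattern described above. The class name `ResourceExtractor` and the `copyStream` helper are hypothetical (the question only shows the `getResourceAsStream("/my_shell.sh")` call); a null return from `getResourceAsStream` is exactly the symptom one would see if the unjar directory named in the URL was never populated:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Hypothetical helper illustrating the pattern from the question:
// extract a script bundled inside the jar to a local file.
public class ResourceExtractor {

    // Copy an input stream to the given path, replacing any existing file.
    static void copyStream(InputStream in, Path target) throws IOException {
        Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        // A null stream means the resource is not visible on the current
        // classpath -- worth checking explicitly, since no exception is
        // thrown and (as in the question) no error is printed otherwise.
        InputStream in = ResourceExtractor.class.getResourceAsStream("/my_shell.sh");
        if (in == null) {
            System.err.println("my_shell.sh not found on classpath");
            return;
        }
        Path target = Paths.get("my_shell.sh");
        copyStream(in, target);
        in.close();
        System.out.println("Extracted to " + target.toAbsolutePath());
    }
}
```

Run under plain `java -cp my.jar ResourceExtractor` this prints the extracted path; under `hadoop jar` it is where the null check would fire.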

  • Try getting the jar location from inside your job and maybe print it. For that, use the property `mapred.jar`, like `System.out.println(conf.get("mapred.jar"));` – SSaikia_JtheRocker Aug 15 '13 at 05:12
  • Thanks for your reply. conf.get("mapred.jar") is null and there is no default value for it. http://wiki.apache.org/hadoop/JobConfFile – Hanbing Luo Aug 15 '13 at 06:17
