
I have a simple shell script which looks like this:

R --vanilla < myMRjob.R
hadoop fs -get /output_03/ /home/user/Desktop/hdfs_output/

This shell script runs myMRjob.R and then copies the output from HDFS to the local file system. It executes fine from the terminal.

When I try to run the shell script from Java code, the MapReduce job is not launched, i.e. the first line isn't executed, while the "hadoop fs -get .." line runs fine through the Java code.

The Java code I used is:

import java.io.*;

public class Dtry {

    public static void main(String[] args) {

        File wd = new File("/home/dipesh/");
        System.out.println("Working Directory: " + wd);
        Process proc = null;

        try {
            proc = Runtime.getRuntime().exec("./Recomm.sh", null, wd);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

The reason behind this whole exercise is that I want to trigger myMRjob.R and display its result in a JSP.

Please help!

dipeshtech
  • What error are you getting? – havexz Feb 19 '12 at 07:54
  • I am not getting any error; myMRjob.R just isn't executed. I see the terminal prompt again the instant after hitting return, while the job takes some time to run and write output to HDFS. I also checked that no output directory is created in HDFS. – dipeshtech Feb 19 '12 at 08:04
  • @dipeshtech: use proc.getErrorStream() to get the errors into an input stream and print it on the console... there should be an error somewhere – Shashank Kadne Feb 19 '12 at 08:07
  • Try http://stackoverflow.com/a/8963445/525978 to print the error and output – havexz Feb 19 '12 at 08:19
  • Thanks Shashank, I didn't get any error, but after adding those lines things started working. – dipeshtech Feb 19 '12 at 08:35
  • Yes, because you need an I/O stream consumer for the job, otherwise it won't work. PS: it is more efficient to use `Rscript --vanilla myMRjob.R` – Simon Urbanek Feb 19 '12 at 14:51
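
The stream-consuming fix mentioned in the comments, as a minimal sketch (the script name and working directory are taken from the question; the consume helper and its names are illustrative, not from the original post):

import java.io.BufferedReader;
import java.io.File;
import java.io.InputStream;
import java.io.InputStreamReader;

public class Dtry {

    // Drain one stream of the child process on its own thread so the
    // script is never blocked on a full stdout/stderr pipe buffer.
    static Thread consume(final InputStream in, final String prefix) {
        Thread t = new Thread(new Runnable() {
            public void run() {
                try {
                    BufferedReader r = new BufferedReader(new InputStreamReader(in));
                    String line;
                    while ((line = r.readLine()) != null) {
                        System.out.println(prefix + line);
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        File wd = new File("/home/dipesh/");
        Process proc = Runtime.getRuntime().exec("./Recomm.sh", null, wd);

        // Consume stdout and stderr while the script runs.
        Thread out = consume(proc.getInputStream(), "OUT> ");
        Thread err = consume(proc.getErrorStream(), "ERR> ");

        // Wait for the script (and the MapReduce job it launches) to finish.
        int exitCode = proc.waitFor();
        out.join();
        err.join();
        System.out.println("Exit code: " + exitCode);
    }
}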

1 Answer


The reason your shell script isn't running from the exec call is that shell scripts are really just text files, not native executables; it is the shell (Bash) that knows how to interpret them, and exec expects to find a native executable binary.

Adjust your Java code like this so that it calls the shell and has the shell run your script:

proc = Runtime.getRuntime().exec("/bin/bash Recomm.sh", null, wd);

When you called hadoop directly from Java, it worked because hadoop is a native executable.
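
For completeness, here is a sketch that combines the explicit shell invocation from this answer with the stream handling discussed in the comments. ProcessBuilder is used only as an alternative to Runtime.exec, and the class name RunRecomm is made up for the example:

import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;

public class RunRecomm {

    public static void main(String[] args) throws Exception {
        // Invoke the shell explicitly and merge stderr into stdout
        // so a single reader drains both streams.
        ProcessBuilder pb = new ProcessBuilder("/bin/bash", "Recomm.sh");
        pb.directory(new File("/home/dipesh/"));
        pb.redirectErrorStream(true);

        Process proc = pb.start();
        BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }

        int exitCode = proc.waitFor();
        System.out.println("Recomm.sh finished with exit code " + exitCode);
    }
}

Once the output is captured like this, it can be collected into a string and rendered from the JSP instead of being printed to the console.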

EricD