I have a simple shell script which looks like this:
R --vanilla < myMRjob.R
hadoop fs -get /output_03/ /home/user/Desktop/hdfs_output/
This shell script runs myMRjob.R and then copies the output from HDFS to the local file system. It executes fine from the terminal.
When I try to run the shell script from Java code, the MapReduce job is not launched, i.e. the first line is not executed, while the "hadoop fs -get ..." line runs fine through the Java code.
The Java code I used is:
import java.io.*;

public class Dtry {
    public static void main(String[] args) {
        File wd = new File("/home/dipesh/");
        System.out.println("Working Directory: " + wd);
        Process proc = null;
        try {
            proc = Runtime.getRuntime().exec("./Recomm.sh", null, wd);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
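Since my code never reads the script's output or waits for it to finish, could the R step be failing silently? Is something like the following the right direction? (This is only a sketch; the class and method names are mine, and the demo command stands in for ./Recomm.sh.)

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;

public class ScriptRunner {
    // Runs a command in the given working directory, merges stderr into
    // stdout, reads everything the process prints, and waits for it to exit.
    static String run(String dir, String... cmd) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.directory(new File(dir));
        pb.redirectErrorStream(true);   // so error messages are not lost
        Process proc = pb.start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(proc.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        int exit = proc.waitFor();      // block until the script finishes
        out.append("exit=").append(exit);
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Demo command; in the real setup this would be "./Recomm.sh".
        System.out.println(run("/tmp", "sh", "-c", "echo hello from script"));
    }
}
```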
The reason behind this whole exercise is that I want to trigger myMRjob.R and display its result in a JSP.
Please help!