I am trying to execute it via:
Process process = Runtime.getRuntime().exec(spark_cmd);
with no luck. The same command run from a shell starts my application, which then succeeds. Running it via exec starts a process that dies shortly afterwards and does nothing. When I try
process.waitFor();
it hangs and waits forever. The real magic begins when I try to read something from the process:
InputStreamReader isr = new InputStreamReader(process.getErrorStream());
BufferedReader br = new BufferedReader(isr);
To do so, I start a thread that reads from the stream in a while loop:
class ReadingThread extends Thread {
    private final BufferedReader reader;

    ReadingThread(BufferedReader reader) {  // constructor name fixed to match the class
        this.reader = reader;
    }

    @Override
    public void run() {
        String line;
        try {
            // print every line the process writes until the stream is closed
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
The application starts, does some work, and hangs. When I abort my application, the Spark application wakes up (??) and completes the remaining work. Does anyone have a reasonable explanation of what is happening?
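For reference, this is roughly how I wire the pieces together (a simplified sketch: spark_cmd stands in for my actual spark-submit command line, and Launcher is just a placeholder class name):

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class Launcher {
    public static void main(String[] args) throws Exception {
        String spark_cmd = "...";  // placeholder for the real command line
        Process process = Runtime.getRuntime().exec(spark_cmd);

        // read stderr on its own thread so the main thread is free to wait
        BufferedReader br = new BufferedReader(
                new InputStreamReader(process.getErrorStream()));
        new ReadingThread(br).start();

        int exitCode = process.waitFor();  // this is the call that never returns
        System.out.println("spark process exited with " + exitCode);
    }
}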
thanks