In my Java application, I need to execute some scripts as subprocesses and monitor their stdout from Java so that I can react to certain output as it appears.
I am using Apache Commons Exec to spawn the subprocess and redirect the executed script's stdout to an input stream.
The problem I am having is that reading from the stream blocks the Java process until the subprocess has finished executing. I cannot wait until the subprocess ends to react to its output; I need to read it asynchronously as it becomes available.
Below is my Java code:
import java.io.IOException;

import org.apache.commons.exec.CommandLine;
import org.apache.commons.exec.DefaultExecutor;
import org.apache.commons.exec.ExecuteStreamHandler;
import org.apache.commons.exec.LogOutputStream;
import org.apache.commons.exec.PumpStreamHandler;

public class SubProcessReact {

    // Called by the stream handler with each complete line of the subprocess's stdout.
    public static class LogOutputStreamImpl extends LogOutputStream {
        @Override
        protected void processLine(String line, int logLevel) {
            System.out.println("R: " + line);
        }
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        CommandLine cl = CommandLine.parse("python printNumbers.py");
        DefaultExecutor e = new DefaultExecutor();

        // Pump the subprocess's stdout into the LogOutputStream implementation above.
        ExecuteStreamHandler sh = new PumpStreamHandler(new LogOutputStreamImpl());
        e.setStreamHandler(sh);

        // Run the blocking execute() call on its own thread so main() is not held up.
        Thread th = new Thread(() -> {
            try {
                e.execute(cl);
            } catch (IOException e1) {
                e1.printStackTrace();
            }
        });
        th.start();
    }
}
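In case it is relevant, here is a minimal sketch of how I understand commons-exec's built-in asynchronous execution via DefaultExecuteResultHandler, which avoids spawning my own thread. The class name SubProcessReactAsync is just for illustration, and since the stream handling is identical I would expect it to show the same behaviour:

import java.io.IOException;

import org.apache.commons.exec.CommandLine;
import org.apache.commons.exec.DefaultExecuteResultHandler;
import org.apache.commons.exec.DefaultExecutor;
import org.apache.commons.exec.LogOutputStream;
import org.apache.commons.exec.PumpStreamHandler;

public class SubProcessReactAsync {

    public static void main(String[] args) throws IOException, InterruptedException {
        CommandLine cl = CommandLine.parse("python printNumbers.py");

        DefaultExecutor executor = new DefaultExecutor();
        executor.setStreamHandler(new PumpStreamHandler(new LogOutputStream() {
            @Override
            protected void processLine(String line, int logLevel) {
                System.out.println("R: " + line);
            }
        }));

        // execute(cl, handler) returns immediately; the handler is notified when the process exits.
        DefaultExecuteResultHandler resultHandler = new DefaultExecuteResultHandler();
        executor.execute(cl, resultHandler);

        // Block only here, at the end, so the JVM stays alive until the script finishes.
        resultHandler.waitFor();
        System.out.println("Exit value: " + resultHandler.getExitValue());
    }
}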
For this example, the subprocess is a Python script that counts upward with a one-second delay between prints, so that I can verify the Java code responds as data comes in.
Python Code:
import time

for x in range(0, 10):
    print x
    time.sleep(1)
I would expect LogOutputStreamImpl to print each line as it arrives, but what actually happens is that reading the stream blocks until the subprocess has completed, and then all of the output is printed at once.
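Concretely, I expect output along these lines, with roughly one second between lines:

R: 0
R: 1
R: 2
...
R: 9

Instead, nothing is printed for about ten seconds, and then all ten lines appear together.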
Is there something I could do to make this work as I intend?