I'm running a small Python program that uses Tornado to collect the output of several Linux named pipes (FIFOs) written to by another program. Unfortunately, not all of the output from the pipes is received, and I can't figure out why.
I add the pipes like so:
for pipe in pipe_files:
    pipe_file = open(pipe, 'r')
    try:
        pipe_stream = PipeIOStream(pipe_file.fileno())
        self.output_streams.append(pipe_stream)
    except IOError:
        logging.warn("Can't open pipe %s", pipe)
        continue
    self.read_lines(pipe_stream, self.new_output)
read_lines registers a callback on each stream like so:
def read_lines(self, stream, callback):
    """
    Read lines forever from the given stream, calling the callback on each line.

    :param stream: a tornado.BaseIOStream
    :param callback: callback method to be called for each line.
    """
    def wrapper(line):
        if not self.output_streams:
            # Output streams have been removed, no need to continue.
            return
        callback(line.strip())
        # Reregister the callback, if the stream hasn't closed yet.
        if not stream.closed():
            stream.read_until(os.linesep, callback=wrapper)

    stream.read_until(os.linesep, callback=wrapper)
Finally, I run the external program with Tornado's Subprocess (capturing its stdout/stderr in the same way) and exit when the subprocess ends.
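Roughly, the launch looks like this (a simplified sketch rather than the exact code; the command itself and the method name are placeholders):

from tornado import ioloop
from tornado.process import Subprocess

def start_process(self, command):
    # Launch the program and expose its stdout/stderr as PipeIOStreams.
    self.proc = Subprocess(command,
                           stdout=Subprocess.STREAM,
                           stderr=Subprocess.STREAM)
    # Capture stdout/stderr with the same line-reading helper as the FIFOs.
    self.read_lines(self.proc.stdout, self.new_output)
    self.read_lines(self.proc.stderr, self.new_output)
    # Stop the IOLoop once the subprocess exits.
    self.proc.set_exit_callback(
        lambda returncode: ioloop.IOLoop.current().stop())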
I do not receive all of the expected output (for instance, the program writes 10,000 lines but the Python side only receives ~7,000). When I simply used cat to read the FIFOs, all of the output was there.
I've made sure the writing program flushes its output correctly. I also tried having it sleep forever after writing, to give Tornado time to pick up the output, but the result was the same.
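For reference, the writer side is doing the moral equivalent of this (a made-up Python stand-in, not the real program; the FIFO path is a placeholder):

fifo_path = "/tmp/example_fifo"  # placeholder; the real pipes are created elsewhere

# Open the FIFO for writing (this blocks until a reader has opened it)
# and write newline-terminated lines, flushing after each one.
with open(fifo_path, "w") as fifo:
    for i in range(10000):
        fifo.write("line %d\n" % i)
        fifo.flush()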
Any ideas?