
I have a Python script that continuously writes a text stream to stdout, something like this (genstream.py):

    import time

    while 1:
        print(int(time.time()))
        time.sleep(1)

I want a bash script that launches the Python script and saves its output to a set of files, splitting the output every hour to avoid creating one huge file that is difficult to manage.

The files created this way will then be processed (i.e. one at the end of each hour) by the same bash script, which inserts the values into a database and moves the files to an archive folder.

I searched Google/Stack Overflow (e.g. "split STDIN to multiple files (and compress them if possible)", "Bash reading STDOUT stream in real-time", or https://unix.stackexchange.com/questions/26175/) but I haven't found a solution so far.

I've also tried something simple like this (so ignoring the time and splitting only by number of lines):

python3 ./genstream.py | split -l5 -

but I get no output.
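A likely cause of the missing output (an assumption, not stated in the question): when stdout is a pipe rather than a terminal, Python block-buffers it, so `split` receives nothing until several kilobytes have accumulated. Forcing unbuffered output with `python3 -u` makes each line flow through immediately. A minimal sketch (the `chunk_` prefix is arbitrary):

```shell
# Python block-buffers stdout when it is a pipe; -u disables buffering,
# so split sees each line as soon as it is printed.
python3 -u ./genstream.py | split -l5 - chunk_

# For programs without an unbuffered flag, stdbuf (coreutils) can force
# line buffering instead:
#   stdbuf -oL some_program | split -l5 - chunk_
```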

I've tried a combination of (named) pipes and tee, but nothing seems to work.

Alex

1 Answer


Try this:

python3 ./genstream.py | while IFS= read -r line; do
  echo "$line" >> "split_$(date +%Y-%m-%d-%H)"
done
ceving
  • It doesn't work. There is no output whatsoever. FYI, also this code doesn't output anything: python3 ./genstream.py | while read line; do echo "$line" done – Alex Apr 23 '19 at 09:01
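The same buffering issue likely explains why the loop sees nothing: adding `-u` to the python3 invocation (or wrapping it in `stdbuf -oL`) should make it work. A sketch of the full hourly split-and-archive loop under that assumption; `process_file` is a hypothetical placeholder for the database-insert step:

```shell
#!/usr/bin/env bash
# Sketch: split genstream.py's output into hourly files and, when the
# hour rolls over, process and archive the finished file.
mkdir -p archive

python3 -u ./genstream.py | while IFS= read -r line; do
  hour=$(date +%Y-%m-%d-%H)
  echo "$line" >> "split_$hour"

  # Hour changed: the previous file is complete, so hand it off.
  if [ -n "$prev" ] && [ "$prev" != "$hour" ]; then
    process_file "split_$prev"    # hypothetical DB-insert step
    mv "split_$prev" archive/
  fi
  prev=$hour
done
```

Note that the `while` loop runs in a subshell (it is on the right side of a pipe), so variables like `prev` are only visible inside the loop, which is all this sketch needs.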