
I'm using daemon to run a Python script of mine as a daemon. However, it seems that the logs are not written to file. The simple bash script I'm using (reformatted here for readability):

if ! daemon --name acme --running; then
    daemon --errlog /home/ubuntu/output.log \
        --dbglog /home/ubuntu/output.log \
        --output /home/ubuntu/output.log \
        --stdout /home/ubuntu/output.log \
        --stderr /home/ubuntu/output.log \
        --respawn \
        --name acme \
        /home/ubuntu/acme.py
fi

It successfully starts the script and keeps it alive without issue. It also logs when the script is killed, but it doesn't log any of the print statements written to stdout.

When the script hits an exception, it seems to dump all of the historical log output that should have been printed along the way at once.


1 Answer

You have an issue with output buffering. Many programs buffer their output to gain some performance, but in some cases this postpones the output becoming visible in the logs.

You will see something in the logs under the following circumstances:

  • when the amount of data in the output buffer reaches a certain size and gets flushed
  • when the output stream is closed, e.g. when the process exits.

You have to somehow force your script to flush its output (e.g. what is printed to stdout) sooner. There are a few options:

  • disable output buffering completely - not very efficient
  • limit the output buffer to one line; each newline then causes the output to be flushed and become visible
  • log using the logging module - its handlers typically flush after each record they emit
  • start your script via an external program - some use tricks to control the output buffer.
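The first three options above can be applied inside the script itself. A minimal sketch in Python 3 (the messages are illustrative; `sys.stdout.reconfigure` requires Python 3.7+):

```python
import logging
import sys

# Option 1: flush explicitly on each print
print("progress message", flush=True)

# Option 2: switch stdout to line buffering, so every
# newline triggers a flush (Python 3.7+)
sys.stdout.reconfigure(line_buffering=True)
print("now flushed at each newline")

# Option 3: use the logging module; StreamHandler flushes
# after each record it emits
logging.basicConfig(stream=sys.stdout, level=logging.INFO)
logging.info("visible in the log immediately")
```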

You will find a number of SO questions and answers about controlling the output buffer. One of them is Disable output buffering; another one, How to make output of any shell command unbuffered?, proposes using unbuffer.
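For a Python script specifically, a simpler alternative to unbuffer is the interpreter's own -u flag, which disables stdout/stderr buffering entirely. A sketch of the daemon invocation from the question, adapted to launch the script through an unbuffered interpreter (the paths and the daemon name are taken from the question; the interpreter path is an assumption, and the command only runs if daemon is installed):

```shell
# Sketch: run acme.py unbuffered under daemon.
# python3 -u disables Python's output buffering, so prints
# reach the log file immediately instead of on exit.
if command -v daemon >/dev/null; then
    daemon --name acme --respawn \
        --output /home/ubuntu/output.log \
        -- /usr/bin/python3 -u /home/ubuntu/acme.py
fi
```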
