I have a long-running Python function that executes as part of an HTTP request handler (the work it does is tied to various options selected on a form).
The request handler waits for the function to complete and then serves an appropriate response back to the originator.
However, since the function can take some time to execute, the request may be interrupted in the meantime, causing the request handler to die and cutting short the long-running function. This is undesirable; I'd rather make sure that the function completes every time it is called.
I've played with using subprocesses, but they seem to die as soon as the request handler does.
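For reference, this is roughly the shape of what I've been trying (simplified; `long_running_task` and `handle_request` are stand-ins for my real function and handler):

```python
import multiprocessing

def long_running_task(options):
    # Stand-in for my actual long-running function.
    ...

def handle_request(form_options):
    # Spawn the work in a child process and block until it finishes.
    worker = multiprocessing.Process(target=long_running_task, args=(form_options,))
    worker.start()
    worker.join()  # if the handler is killed here, the child seems to die with it
    return "done"
```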
Can I fire the function off in a separate thread or process, and if so, how do I indicate that it should be allowed to outlive its parent?
Daemons seem to partially solve my problem (as I understand it, they can outlive their parent), but I'd still like the parent process to be able to monitor and access the results of the child process.
That is:
- The parent process may die before the child does; in that case, I'd like the child process to run to completion.
- As long as the parent process is alive, I'd like it to have access to the child process so it can monitor it, and, assuming the parent survives, retrieve the results of the processing the child was doing (a sketch of the kind of arrangement I'm imagining follows).
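For concreteness, here is a minimal sketch of what I'm picturing. It assumes a hypothetical `worker.py` script that wraps my long-running function and writes its result to a file; the child is detached into its own session so it can survive the handler, and the parent polls the result file while it is alive:

```python
import json
import os
import subprocess
import sys
import tempfile

def start_detached(form_options):
    # Where the child will drop its result; the parent polls this path.
    result_path = os.path.join(tempfile.mkdtemp(), "result.json")

    # start_new_session=True runs the child in its own session (via setsid),
    # so signals aimed at the handler's process group don't take it down too.
    # "worker.py" is a hypothetical wrapper around my long-running function.
    subprocess.Popen(
        [sys.executable, "worker.py", json.dumps(form_options), result_path],
        start_new_session=True,
    )
    return result_path

def poll_result(result_path):
    # Called by the parent while it is still alive.
    if os.path.exists(result_path):
        with open(result_path) as f:
            return json.load(f)
    return None  # child still running
```

Is something along these lines reasonable, or is there a more standard way to do this?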