I have used a number of Python web servers, including the standard http.server, Flask, Tornado, Dash, Twisted, and CherryPy, and I have also read up on Django. As far as I can tell, none of these offer anything remotely resembling true multi-threading. With Django, for example, the recommendation is to use Celery, which is a completely separate queue-based task manager. Yes, we can always resort to external queueing, but that just confirms there is nothing native that comes closer to in-process multithreading. I am well aware of the GIL, but at a minimum I would like the workers to share the same code - akin to fork() for a C program.
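For what it is worth, the standard library does allow something fork-like while keeping the HTTP parsing, by mixing socketserver.ForkingMixIn into http.server's HTTPServer (Unix only; the port below is arbitrary). A minimal sketch of that:

```python
import socketserver
from http.server import HTTPServer, BaseHTTPRequestHandler

class ForkingHTTPServer(socketserver.ForkingMixIn, HTTPServer):
    # ForkingMixIn forks a child process per request; HTTPServer keeps the
    # standard HTTP request parsing and response machinery.
    pass

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"handled in a forked child process\n"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    ForkingHTTPServer(("", 8000), Handler).serve_forever()
```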
One thought is to try the multiprocessing library, and in fact there is a Q&A on that approach with an accepted answer: https://stackoverflow.com/a/28149481/1056563 . However, that approach appears to be pure TCP/IP sockets: it does not include the important HTTP handling support. That leaves far too much to be re-implemented (including round objects such as the wheel).
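Just to make the objection concrete, here is roughly the shape of a raw-socket multiprocessing server (my own sketch, not the code from the linked answer; assuming a Unix-like system so the listening socket is inherited by the children). The processes do spread accept() across CPUs, but everything above the socket layer - request parsing, headers, keep-alive, responses - would have to be reinvented by hand:

```python
import multiprocessing
import socket

def worker(listener):
    while True:
        conn, addr = listener.accept()
        data = conn.recv(4096)  # raw bytes: request line, headers, body all unparsed
        # ...here we would have to parse the HTTP request and build the response ourselves
        conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\nConnection: close\r\n\r\nok")
        conn.close()

if __name__ == "__main__":
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("", 8080))
    listener.listen(128)
    workers = [multiprocessing.Process(target=worker, args=(listener,))
               for _ in range(multiprocessing.cpu_count())]
    for p in workers:
        p.start()
    for p in workers:
        p.join()
```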
Is there any way to merge the multiprocessing library approach with an available web server library such as Twisted, Tornado, Dash, etc.? Otherwise, how do we take advantage of their useful HTTP handling capabilities?
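If I understand correctly, Tornado has a built-in multi-process (pre-fork) mode that may be the kind of merge I am after; here is a sketch of what I think that looks like (assuming Tornado is installed, arbitrary port):

```python
import tornado.httpserver
import tornado.ioloop
import tornado.web

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("hello from a per-process event loop\n")

def main():
    app = tornado.web.Application([(r"/", MainHandler)])
    server = tornado.httpserver.HTTPServer(app)
    server.bind(8888)
    server.start(0)  # 0 = fork one child process per CPU core
    tornado.ioloop.IOLoop.current().start()

if __name__ == "__main__":
    main()
```

If I read the docs right, each forked child gets its own event loop and its own copy of the application, so there is no shared in-process state between them; anything shared would still have to go through the database or multiprocessing primitives.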
**Update:** We have a mix of workloads:
- small/quick responses (sub-millisecond CPU): e.g. a couple of RDBMS calls
- moderate compute (double-digit millisecond CPU): e.g. encryption/decryption of audio files
- significant compute (hundreds of milliseconds to single-digit seconds of CPU): e.g. signal processing of audio and image files
We need to be able to leverage multiple CPUs on a given machine to handle this mix of tasks/workloads concurrently.
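For the heavier items, the kind of thing I am picturing is a single HTTP front end that keeps the quick requests on the event loop and ships CPU-heavy work to a concurrent.futures.ProcessPoolExecutor; a sketch using Tornado's IOLoop.run_in_executor (do_signal_processing and the routes are placeholders of mine, not real code from our system):

```python
from concurrent.futures import ProcessPoolExecutor

import tornado.ioloop
import tornado.web

pool = ProcessPoolExecutor()  # defaults to one worker process per CPU

def do_signal_processing(path):
    # placeholder for the hundreds-of-milliseconds audio/image work
    return "processed %s" % path

class QuickHandler(tornado.web.RequestHandler):
    def get(self):
        # sub-millisecond work: fine to run directly on the event loop
        self.write("quick response\n")

class HeavyHandler(tornado.web.RequestHandler):
    async def get(self):
        path = self.get_argument("path", "sample.wav")
        # CPU-bound work runs in a separate process, so it uses another core
        # and does not block this event loop (the GIL is per process)
        result = await tornado.ioloop.IOLoop.current().run_in_executor(
            pool, do_signal_processing, path)
        self.write(result + "\n")

if __name__ == "__main__":
    app = tornado.web.Application([(r"/quick", QuickHandler),
                                   (r"/heavy", HeavyHandler)])
    app.listen(8888)
    tornado.ioloop.IOLoop.current().start()
```

The idea would be that the sub-millisecond RDBMS-style requests never leave the event loop, while the encryption and signal-processing work lands on the pool and therefore on other cores.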