Our server has a lot of CPUs, and some web requests could be served faster if the request handlers did some of their processing in parallel.
Example: some work needs to be done on N (roughly 1-20) pictures to serve one web request.
Caching, or doing the work before the request comes in, is not possible.
What can be done to use the machine's multiple CPUs?
- threads: I don't like them
- multiprocessing: every request would need to start N processes. Many CPU cycles are lost starting new processes and importing libraries.
- a special (hand-made) service that keeps N processes ready for processing
- Celery (RabbitMQ): I don't know how big the communication overhead is...
- another solution?
Platform: Django (Python)
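For reference, the "N processes ready for processing" option can be approximated with just the standard library: a `concurrent.futures.ProcessPoolExecutor` created once at module import keeps its worker processes alive across requests, so the per-request cost is only pickling and dispatch, not process startup or library imports. This is only a minimal sketch; `process_picture` and `handle_request` are hypothetical stand-ins for the real picture work and the Django view.

```python
from concurrent.futures import ProcessPoolExecutor

# Created once at import time: the workers stay alive between requests,
# so per-request overhead is pickling/dispatch, not process startup.
POOL = ProcessPoolExecutor(max_workers=8)

def process_picture(picture):
    # Placeholder for the real CPU-bound work on one picture.
    return picture * 2

def handle_request(pictures):
    # Fan the N pictures out to the worker pool and collect the
    # results in the original order.
    return list(POOL.map(process_picture, pictures))

if __name__ == "__main__":
    print(handle_request([1, 2, 3]))
```

Note that the worker function and its arguments must be picklable, and with multiple Django worker processes (e.g. under Gunicorn) each worker gets its own pool, so the total process count should be sized accordingly.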