I have a desktop app that I'm in the process of porting to a Django webapp. The app has some quite computationally intensive parts (using numpy, scipy and pandas, among other libraries). Obviously, importing the computationally intensive code into the webapp and running it inside the request/response cycle isn't a great idea, as this forces the client to wait for the whole computation to finish before getting a response.
Therefore, you'd have to farm these tasks out to a background process that notifies the client (via AJAX, I guess) and/or stores the results in the database when they're complete.
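To make the pattern concrete, here's the rough shape I'm imagining on the Django side (just a sketch: the `Job` model and `enqueue_heavy` helper are made-up placeholders, not code I actually have):

```python
# Rough sketch of the submit/poll pattern: one view enqueues the job and
# returns a job id, another view is polled via AJAX until the background
# process has stored the result.
from django.http import JsonResponse

from .models import Job            # hypothetical model with status/result fields
from .tasks import enqueue_heavy   # hypothetical "farm it out" helper

def start_computation(request):
    job = Job.objects.create(status="pending")
    enqueue_heavy(job.id, request.POST)   # hand the work to a background process
    return JsonResponse({"job_id": job.id})

def job_status(request, job_id):
    job = Job.objects.get(pk=job_id)
    return JsonResponse({"status": job.status, "result": job.result})
```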
You also don't want all these tasks running simultaneously in the case of multiple concurrent users, since that is a great way to bring your server to its knees even with a small number of concurrent requests. Ideally, you want each instance of your webapp to put its tasks into a job queue that then runs them automagically in an optimal way (based on the number of cores, available memory, etc.), something like the sketch below.
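For the queueing behaviour, a single shared process pool sized to the machine is roughly what I mean; this is only an illustration of the sizing idea (the `heavy_computation` and `submit` names are invented), not something I've built:

```python
# Sketch of the "bounded queue" behaviour: one pool for the whole webapp,
# sized to the core count, so concurrent users queue up instead of each
# spawning their own heavy process.
import os
from concurrent.futures import ProcessPoolExecutor

# shared pool; jobs beyond max_workers wait in the pool's internal queue
executor = ProcessPoolExecutor(max_workers=os.cpu_count())

def heavy_computation(params):
    # the numpy/scipy/pandas work would live here
    ...

def submit(params):
    # returns a Future the caller can check later
    return executor.submit(heavy_computation, params)
```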
Are there any good Python libraries to help with this sort of issue? Are there general strategies that people use in these kinds of situations? Or is this just a matter of choosing a good batch scheduler and spawning a new Python interpreter for each process?