For some reason I could not find an answer to the question below, probably because it is just too obvious.
During my experiments with Perl Dancer, I added a route which just sleeps for 10 seconds and then returns something, in order to simulate a long-running operation. I noticed that during these 10 seconds, Dancer would not serve any other request. I understand that this is because Dancer is single-threaded.
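For reference, the route in question is roughly this (a minimal sketch; the route path and return value are just placeholders I made up for the test):

```perl
use Dancer;

# Simulated long-running operation: sleep blocks the single
# Dancer process, so no other request is served for 10 seconds.
get '/slow' => sub {
    sleep 10;
    return "finally done";
};

dance;
```

While `/slow` is sleeping, a request to any other route in the same app just hangs until the sleep finishes.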
Now this single-threaded approach is obviously not suitable for even mildly demanding applications. So I believe there must be a number of established solutions. But I just don't seem to know the right search terms to google for.
To make things clear: I don't mind that the request which initiated the long-running operation is itself blocked. What I want is for all other requests to still be served.
Could anybody please enlighten me on the following:
- How do web servers traditionally handle long-running operations without blocking other requests?
- Will there be a thread/process for each session, or can threads/processes be spawned on demand in situations where I know an operation will take a long time?
- How is session information preserved when going multi-threaded or multi-process, i.e. when a browser does not always talk to the same process?
- Any particular recommendations concerning Dancer? (Feel free to recommend an alternative to Dancer.)