Suppose I need to query a database that returns a result set of 100,000+ rows, and I then need to process that data. Can this be done successfully in a continuous WebJob? If so, how is the queue managed? I currently have this question,
which discusses a problem with using a continuous WebJob with a timer trigger. The queue is being dumped if the WebJob restarts; by "dumped" I mean the queue is not processed any further. And if a Take
is used to limit the rows in the query, the next poll event does not process any data.
So much is managed under the hood with these WebJobs that it's hard to get a good grasp of how to manage a large queue.
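Conceptually, the batching I'm trying to achieve looks like the sketch below. This is Python for illustration only, not my actual WebJobs C# code; `fetch_batch`, `process`, and the checkpoint dict are made-up names. The idea is keyset pagination with a persisted checkpoint, so that a restart resumes from the last processed row instead of losing its place:

```python
# Illustrative sketch only (my real job is a WebJobs function):
# process 100k+ rows in batches, persisting a checkpoint after each
# batch so a restart resumes instead of "dumping" the remaining work.

BATCH_SIZE = 1000

def fetch_batch(rows, after_id, limit):
    """Keyset pagination: rows with id > after_id, up to 'limit' (the Take)."""
    return [r for r in rows if r["id"] > after_id][:limit]

def process(row):
    row["processed"] = True

def run(rows, checkpoint):
    """One poll event: drain the remaining rows batch by batch."""
    while True:
        batch = fetch_batch(rows, checkpoint["last_id"], BATCH_SIZE)
        if not batch:
            break
        for row in batch:
            process(row)
        # Persist progress after each batch; on restart we resume from here.
        checkpoint["last_id"] = batch[-1]["id"]
    return checkpoint

rows = [{"id": i, "processed": False} for i in range(1, 2501)]
cp = {"last_id": 0}
run(rows, cp)
```

Whether this kind of checkpointing is something the WebJobs SDK handles for me, or something I have to build myself, is exactly what I can't work out.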
My question:
Are WebJobs suited to processing large amounts of data?
If so, should they be continuous or scheduled, and why?