
I have an action that returns only a JSON string with information.

To prevent users from editing the JS code that updates the information every 2 seconds, I also need a server-side delay to prevent high CPU load.

How safe is it to use Task.Delay(2000) if there are (for example) 2000-5000 users making the same request at the same time? The information returned is different for each user.

AlekPsk
  • I don't totally understand what you want to achieve, but why not use some sort of queue and execute the operation with a timer? – Konstantin Peshekhonov May 18 '16 at 07:32
  • @KonstantinPeshekhonov IIS already uses queues, supports throttling *and* caching of results. There is no need to reinvent them – Panagiotis Kanavos May 18 '16 at 07:40
  • @AlekPsk what you describe is not a high request load. HTTP, IIS and ASP.NET all have ways to handle caching responses. You can use output caching for individual user responses, eg with a small timeout of 2 seconds. You can specify the proper headers so that IIS itself will return a cached response if appropriate (eg ETag, Expires). You can modify your client so that it requests only updated versions with the `If-Modified-Since` header or `If-None-Match` and the last ETag – Panagiotis Kanavos May 18 '16 at 07:47
  • 1
    You should also consider using SignalR to *push* changes to the clients when they occur instead of having the clients poll the server for changes. SignalR will use the best available notification mechanism for each client (eg. WebSockets, infinite frame, long polling). Even in the worst case, it will be a lot more scaleable than hand-writing the same code – Panagiotis Kanavos May 18 '16 at 07:50
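As a sketch of the output-caching suggestion in the comments above (the controller and action names are hypothetical; `VaryByCustom = "user"` assumes a matching `GetVaryByCustomString` override in Global.asax):

```csharp
using System;
using System.Web.Mvc;
using System.Web.UI;

public class StatusController : Controller
{
    // Cache each user's JSON response on the server for 2 seconds, so
    // repeated polls within that window are served from the cache
    // instead of re-running the action.
    [OutputCache(Duration = 2, VaryByCustom = "user", Location = OutputCacheLocation.Server)]
    public ActionResult GetInfo()
    {
        return Json(new { time = DateTime.UtcNow }, JsonRequestBehavior.AllowGet);
    }
}
```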

4 Answers


Task.Delay is totally safe to use, since it doesn't involve creating or blocking threads, and it doesn't stall the CPU.

On the other hand, it is not going to help you, since it is still possible to make multiple requests from one machine. Delaying the execution without further checks is a useless way to throttle requests.

Patrick Hofman

Why do you think adding Task.Delay(2000) will reduce the CPU load? If you have a high CPU load at T, adding Task.Delay(2000) only postpones the high CPU load to T+2, which doesn't help at all.

A quick solution is checking the submit frequency on the UI side: on a web page, disable the submit button and re-enable it after a few seconds. But this can be cheated, since the front-end scripts can be modified.

A safer solution is checking the submit frequency on the server side: record the last submit time somewhere (e.g. a static variable, the simplest option), and reject requests that arrive too soon.
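A minimal sketch of that server-side check, assuming a per-user key and a 2-second window (in a real app this logic would sit in the controller or an action filter, not in Main):

```csharp
using System;
using System.Collections.Concurrent;

public static class SubmitThrottle
{
    // Last accepted submit time per user key.
    private static readonly ConcurrentDictionary<string, DateTime> LastSubmit =
        new ConcurrentDictionary<string, DateTime>();

    public static bool IsAllowed(string userId, TimeSpan minInterval, DateTime now)
    {
        bool allowed = false;
        // Atomically read-and-update the stored timestamp. Note the update
        // lambda may run more than once under contention; fine for a sketch.
        LastSubmit.AddOrUpdate(
            userId,
            _ => { allowed = true; return now; },           // first request: accept
            (_, last) =>
            {
                if (now - last >= minInterval) { allowed = true; return now; }
                return last;                                 // too soon: keep old time, reject
            });
        return allowed;
    }
}

public static class Demo
{
    public static void Main()
    {
        var t0 = DateTime.UtcNow;
        var window = TimeSpan.FromSeconds(2);
        Console.WriteLine(SubmitThrottle.IsAllowed("alice", window, t0));                 // True  (first request)
        Console.WriteLine(SubmitThrottle.IsAllowed("alice", window, t0.AddSeconds(1)));   // False (too soon)
        Console.WriteLine(SubmitThrottle.IsAllowed("alice", window, t0.AddSeconds(2.5))); // True  (window elapsed)
    }
}
```

A rejected request can then be answered immediately with HTTP 429, which is cheaper than holding it open with Task.Delay.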

Cheng Chen

Besides the other answers, which are correct: in ASP.NET, if the user uses the ASP.NET session, there is an issue you must know about, because the session lock blocks all of that user's requests until the call returns.

So if you use that Delay with session state, you block the user's other requests for the duration... Please read about:

Does ASP.NET Web Forms prevent a double click submission?
Web app blocked while processing another web app on sharing same session
What perfmon counters are useful for identifying ASP.NET bottlenecks?
Replacing ASP.Net's session entirely
Trying to make Web Method Asynchronous
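One way around that session lock, sketched here with ASP.NET MVC's SessionState attribute (the controller name and payload are illustrative assumptions):

```csharp
using System.Threading.Tasks;
using System.Web.Mvc;
using System.Web.SessionState;

// Declaring session access as read-only (or Disabled) means this
// controller does not take the exclusive session write lock, so a
// delayed request no longer blocks the same user's other requests.
[SessionState(SessionStateBehavior.ReadOnly)]
public class InfoController : Controller
{
    public async Task<ActionResult> GetInfo()
    {
        await Task.Delay(2000); // no session write lock is held while waiting
        return Json(new { ok = true }, JsonRequestBehavior.AllowGet);
    }
}
```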

Aristos

As far as Task.Delay itself goes, yes, this is fine. Task.Delay creates a task backed by a timer that completes the task when it fires (in the timer's callback). Given the way it works, it doesn't block your thread and doesn't execute on another thread, so it seems to be fine. The number of requests you posted also doesn't sound big.

It is true, however, that my answer is more about using Task.Delay in ASP.NET MVC than about your particular scenario, which you would need to describe in detail if you need a more specific answer.
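For reference, the usual shape of Task.Delay in an async MVC action looks like this (controller and payload are illustrative assumptions); awaiting the delay returns the request thread to the pool instead of blocking it, unlike Thread.Sleep:

```csharp
using System.Threading.Tasks;
using System.Web.Mvc;

public class DataController : Controller
{
    public async Task<ActionResult> GetData()
    {
        // The thread is released back to the pool for these 2 seconds;
        // the timer's callback resumes the action when it fires.
        await Task.Delay(2000);
        return Json(new { value = 42 }, JsonRequestBehavior.AllowGet);
    }
}
```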

Jakub Szumiato