I need to create software that polls a hardware device.
static void Main(string[] args)
{
    while (true)
    {
        DoSomething();
        Task.Delay(1).Wait();
    }
}

static void DoSomething() { }
I noticed that if I don't add even the smallest delay, CPU usage climbs to 30-40%, but with even a 1 ms delay it stays at around a couple of percent.
My development environment is .NET/C#.
To me and the business, the 1 ms delay doesn't feel necessary, but it seems to make a world of difference.
It feels like it's a good idea to add even a tiny delay. Why?
EDIT:
Given the piece of code above, with an empty DoSomething(), why does adding Task.Delay(1).Wait(); bring CPU usage down so much?
It seems like such a trivial thing. How can it have such an impact?
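In case it helps, here is a self-contained sketch of the comparison I'm making. It bounds the loop to a fixed iteration count (a number I picked arbitrarily) so it terminates, and times both variants with Stopwatch; the class name DelayDemo is just mine:

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

class DelayDemo
{
    const int Iterations = 100;

    static void Main()
    {
        // Variant 1: busy loop, no delay. The thread never yields,
        // so one core spins at full speed for the whole run.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
        {
            DoSomething();
        }
        sw.Stop();
        Console.WriteLine($"No delay:   {sw.ElapsedMilliseconds} ms");

        // Variant 2: the same loop with Task.Delay(1).Wait().
        // Each iteration hands the thread back to the OS scheduler.
        // The actual wait is often longer than 1 ms, since the OS
        // timer resolution is coarser (historically ~15 ms on Windows).
        sw.Restart();
        for (int i = 0; i < Iterations; i++)
        {
            DoSomething();
            Task.Delay(1).Wait();
        }
        sw.Stop();
        Console.WriteLine($"With delay: {sw.ElapsedMilliseconds} ms");
    }

    static void DoSomething() { }
}
```

The no-delay loop finishes almost instantly while pegging the core; the delayed loop takes at least Iterations milliseconds of wall time but leaves the CPU nearly idle in between.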