
I am developing an application (C#, .NET 3.5) that executes multiple jobs in parallel. When I run one or two jobs in parallel, CPU utilization is low, around 2%. When I run 50 jobs in parallel, CPU utilization stays fine (under 10%) for about 20 minutes.

But then it suddenly jumps to 99% and the PC hangs. The jobs perform database operations and LINQ queries. Any idea that could point me in the right direction for tuning my application would help.

Also, are there any .NET tools that can identify code with high CPU utilization?

I know it's odd to ask, just like that, what might cause high CPU utilization.

Edit:

For a single job the CPU utilization does not increase; it only happens with multiple jobs. I don't know what is causing the high CPU utilization. Any help is appreciated.
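One common cause of a sudden CPU spike with many parallel jobs is unbounded concurrency: 50 jobs all contending for connections, locks, and the thread pool at once. A minimal sketch of throttling concurrency with a `Semaphore`, which works on .NET 3.5 (no TPL required) — `DoWork` here is a hypothetical stand-in for the real job body:

```csharp
using System;
using System.Threading;

class JobThrottle
{
    // Allow at most 4 jobs to run concurrently; the rest block and wait.
    static readonly Semaphore gate = new Semaphore(4, 4);

    static void RunJob(object state)
    {
        gate.WaitOne();           // block until a slot is free
        try
        {
            DoWork((int)state);   // hypothetical job body (DB + LINQ work)
        }
        finally
        {
            gate.Release();       // free the slot even if DoWork throws
        }
    }

    static void DoWork(int id)
    {
        Console.WriteLine("job {0} running", id);
    }

    static void Main()
    {
        for (int i = 0; i < 50; i++)
            ThreadPool.QueueUserWorkItem(RunJob, i);
        Thread.Sleep(2000);       // crude wait so the demo finishes; use signaling in real code
    }
}
```

With a cap like this, queuing 50 jobs no longer means 50 jobs executing at once, which often keeps CPU and database contention bounded.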

jaks
  • "So bite me"? Ok, let's see how your rep looks tomorrow... – annakata Sep 27 '10 at 13:45
  • Please show a bit of decency. – ChaosPandion Sep 27 '10 at 13:45
  • Does the same thing happen after 20 minutes with a single job running? You could try a profiling tool like the ANTS Profiler trial, and also go through the Windows Event Viewer to see what was logged when the PC hangs. – GenEric35 Sep 27 '10 at 13:48
    Hold on a minute: *why is lots of CPU utilization bad*? I would say that the situation where you have fifty jobs running and *not a single one of them is actually doing anything* is more likely to be the *bad* situation. You bought a computer to *compute stuff* - if the job is only using 10% of the CPU then you bought ten times too much computer. Lots of CPU utilization is *good*, it indicates that the computer is doing what it was designed to do: compute stuff. – Eric Lippert Sep 27 '10 at 14:12

1 Answer


Such a tool is called a profiler. You can see some profiler recommendations at What Are Some Good .NET Profilers?
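Before reaching for a full profiler, a quick first check is to measure how much CPU time the process itself consumes over a wall-clock interval, using `Process.TotalProcessorTime` (available in .NET 3.5). A minimal sketch — the `Thread.Sleep` stands in for whatever work you want to measure:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class CpuSample
{
    static void Main()
    {
        Process p = Process.GetCurrentProcess();
        TimeSpan before = p.TotalProcessorTime;
        Stopwatch wall = Stopwatch.StartNew();

        // ... the work you want to measure goes here ...
        Thread.Sleep(500);

        wall.Stop();
        p.Refresh();                                  // re-read process counters
        TimeSpan cpu = p.TotalProcessorTime - before; // CPU time used in the interval

        // Rough percentage of one core used over the interval.
        double percent = 100.0 * cpu.TotalMilliseconds / wall.ElapsedMilliseconds;
        Console.WriteLine("CPU: {0:F1}% of one core", percent);
    }
}
```

Logging this periodically from each job can tell you whether the spike comes from your own code or from something external (e.g. the database driver) before you attach a profiler.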

Midhat