When I profile my C# application in Visual Studio 2010, in Line View, the second-highest time-consuming function listed is System.Windows.Forms.Application.DoEvents(); seventh on the list is System.Windows.Forms.Form.ShowDialog(). These two account for about 8% and 2.5% of the total exclusive samples, respectively.

The program does not have much user interaction. The user clicks a button, the application starts and runs its algorithms for about a minute, and then stops. During that period there is no user interaction, but there is heavy CPU and I/O use.

I am not sure I understand why the above two functions (DoEvents and ShowDialog) capture so many exclusive samples. Is there anything that can be done for these two?

EDIT for clarification: The application has 4 different threads. One thread reads data from an external device and places it in a queue. Another thread reads data from the queue and performs data manipulation; this CPU-intensive thread places the manipulated data into another queue. The 3rd thread reads that queue and writes the data to disk at regular intervals. These three threads are implemented as BackgroundWorkers. The final (4th) thread is the application's main Form itself, which is actually inactive during the whole process.
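For reference, a minimal sketch of that three-stage pipeline. All names here are hypothetical stand-ins (the real app uses BackgroundWorkers and a device reader; plain Threads and integers are used below just to keep the sketch self-contained). BlockingCollection is the .NET 4 building block for exactly this producer/consumer shape:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class PipelineSketch
{
    static void Main()
    {
        var rawQueue = new BlockingCollection<int>();
        var processedQueue = new BlockingCollection<int>();

        // Thread 1: producer (the external-device reader in the real app).
        var reader = new Thread(() =>
        {
            for (int i = 0; i < 100; i++) rawQueue.Add(i);
            rawQueue.CompleteAdding();          // signal end of stream
        });

        // Thread 2: CPU-intensive manipulation.
        var worker = new Thread(() =>
        {
            foreach (var x in rawQueue.GetConsumingEnumerable())
                processedQueue.Add(x * 2);      // stand-in for the real transform
            processedQueue.CompleteAdding();
        });

        // Thread 3: consumer (the periodic disk writer in the real app).
        var writer = new Thread(() =>
        {
            foreach (var y in processedQueue.GetConsumingEnumerable())
                Console.WriteLine(y);           // stand-in for FileStream.Write
        });

        reader.Start(); worker.Start(); writer.Start();
        reader.Join(); worker.Join(); writer.Join();
    }
}
```

GetConsumingEnumerable blocks until items arrive and exits cleanly after CompleteAdding, so none of the three threads needs hand-rolled locking or polling.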

SomethingBetter
  • Paint and windows update events can be handled on the DoEvents call. Windows painting is pretty slow but it should not be anything to worry about. – CodingBarfield Feb 07 '11 at 17:21

3 Answers

System.Windows.Forms.Application.DoEvents() is almost like saying "do all pending GUI logic". Using DoEvents is not recommended and can even be considered dangerous, because it introduces race conditions and unspecified behaviour in many GUI scenarios.
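A contrived sketch of the kind of re-entrancy DoEvents invites (hypothetical button handler, not from the question's code; shown as a WinForms fragment rather than a complete program):

```csharp
// A click handler that pumps messages mid-work. If the user clicks the
// button again while DoEvents is pumping, this handler re-enters itself
// before the first run has finished -- a classic DoEvents race.
private void runButton_Click(object sender, EventArgs e)
{
    for (int i = 0; i < 1000; i++)
    {
        DoExpensiveStep(i);          // hypothetical work item
        Application.DoEvents();      // pumps queued messages,
                                     // including another Click
    }
}
```

Disabling the button at the top of the handler papers over this particular case, but every DoEvents call site has to be audited the same way, which is why the advice is simply not to use it.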

Euphoric
  • I do not explicitly call Application.DoEvents() anywhere in my code. – SomethingBetter Feb 07 '11 at 16:36
  • I'll say it's dangerous. Best example of this is a Windows timer whose Tick handler doesn't disable the timer, then performs an operation that takes longer than the timer interval and then calls Application.DoEvents. That eventually results in a stack overflow! (real-world story too!) – Quibblesome Feb 07 '11 at 19:05

I assume what you actually want to know is - what's making the app slow, right? If the question is only for curiosity, forget this answer.

While it's taking its time, just hit the pause button, and then examine the stack in each thread.

Do it a few times. You will see exactly what the problem is.

8% and 2.5% exclusive time is pure useless hoo-haw. Some call (not a function, a function call)** in your code is on one of those threads' stacks a large percentage of the time. That's your bottleneck, and you will see it.

That is the random-pausing technique, and it just works.

** Sometimes this point is missed. The difference between a function and a function call is like the difference between a suitcase and the hand that holds the handle. The bottleneck isn't a function, it's a line of code that calls a function (even if only microcode). Other lines of code calling that function may not be bottlenecks.
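To make the suitcase-and-handle distinction concrete, a hypothetical fragment (names invented for illustration) where the same function is called from two places but only one call site matters:

```csharp
// Both lines call the same function, but only one is the bottleneck.
foreach (var line in hugeFile)       // millions of iterations:
    records.Add(Parse(line));        // <-- this call site dominates the stacks

var header = Parse(firstLine);       // same function, called once: irrelevant
```

A stack sample pinpoints the first call site, not merely "Parse is hot", which tells you the fix is to hoist or batch that particular call, not to micro-optimize Parse itself.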

Mike Dunlavey
  • Interesting idea, however not applicable to our application. The reason is that we are reading data from an external device while the application is running. When we hit pause, we start missing incoming data from the external device (no way to "pause" the external device), which causes the integrity of the datastream to fail, i.e., the data becomes meaningless and the application cannot continue any further. Thanks though, the link was interesting. – SomethingBetter Feb 08 '11 at 10:34
  • 1
    @SomethingBetter: You're welcome. Actually, I think you can do it. It's just that each pause is "sacrificial". You start the whole thing running normally, interrupt it, and record the stack. Then you kill it and start all over again. It's like an associate of mine who was studying the time-course of an anesthetic in rats. He would inject the medicine, wait a bit, decapitate and flash-freeze the rat, and then see where the medicine had gone. Made him a vegan :) Anyway, that's the method. – Mike Dunlavey Feb 08 '11 at 12:48
  • @SomethingBetter: If it wasn't clear, you don't need many samples. Initial problems often cost 50% or more, which is the chance of seeing it each time, and is how much time you can save by fixing it. Another way of saying it: if you take n=3 samples, and see something you can fix on s=2 of them, fixing that thing saves you an expected time of (s+1)/(n+2)=60%. (Expected in the statistical sense.) – Mike Dunlavey Feb 08 '11 at 13:05
  • @Mike: Yes, I guess I could (start-pause-examine stack-kill) the entire application. Our expectation, from observing the profile data, is that n and s in your example will be about n=20, s=2, i.e., no single operation consumes more than 10% of all CPU cycles. It used to be much worse, as you point out in your comment. Before we started optimizing, 2 functions, in fact about 5-6 lines of code, consumed 40% of CPU cycles. This method would have worked quite well then. But now the hot blocks are better spread out, so taking 3 samples will probably not help, but it is worth a try. – SomethingBetter Feb 08 '11 at 16:02
  • @SomethingBetter: The thing is, that 8% of time in DoEvents is *exclusive*, meaning the program counter is in that routine. That means it's processing a *lot* of events. Many of those events might well be superfluous. Also, the VS sampler is *blind* to I/O time, so it won't see excess I/O no matter how huge it is. What's more, each thread traces out a call tree, and exclusive sampling only tells about the leaves, not about the unnecessary branches, or fruit (I/O). Pausing suffers none of these problems. If it can't find it, nothing will. – Mike Dunlavey Feb 08 '11 at 18:11
  • @Mike: Very interesting points. Let me ask you something. If I have a BinaryWriter, connected to a FileStream, and I do a BinaryWriter.Write() with a large array as the source, will the VS sampler not see the I/O time spent in this routine? – SomethingBetter Feb 09 '11 at 08:31
  • @SomethingBetter: Correct, because in sampling mode it suspends sampling whenever the process itself is suspended, such as when waiting for I/O completion. This concept had its origin in the first profiler `prof` which only sampled the program counter, which is of course meaningless when the program's blocked. – Mike Dunlavey Feb 09 '11 at 13:15

DoEvents is used to process all the messages in the Windows message queue. You're better off using the TPL or asynchronous processing for long-running tasks.
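For example, with .NET 4's TPL (available in the question's VS 2010 setup) the long-running work can be pushed off the UI thread with no DoEvents pumping at all. RunAlgorithms and resultLabel here are hypothetical stand-ins for the question's button-triggered work; the fragment belongs inside a Form:

```csharp
// Run the algorithm on a thread-pool thread; marshal the result back to
// the UI thread via a continuation instead of pumping messages.
Task.Factory.StartNew(() => RunAlgorithms())           // hypothetical worker method
    .ContinueWith(t => resultLabel.Text = t.Result,    // hypothetical UI control
                  TaskScheduler.FromCurrentSynchronizationContext());
```

The UI stays responsive because the message loop keeps running normally on the Form's thread; the continuation's TaskScheduler ensures the label update happens back on that thread, where WinForms requires it.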

Furthermore, ShowDialog blocks until the form is closed. The question "Is it possible to use ShowDialog without blocking all forms?" explains this better than I can.

All this WinForms GUI event handling and processing is rather CPU-intensive, and the profiler is making that very clear. You probably have nothing to worry about, provided long-running tasks are done on worker threads.

Tom