Update: I wrote a program to test the memory implications of each of the techniques I mention below. Not too surprisingly, I found that the conventional approach using .NET events does indeed create garbage, whereas the other two strategies appear to create none at all.
Really, I should have stressed all along that I was more interested in the memory overhead of `TEventArgs` arguments in .NET events than in the cost in terms of speed. Ultimately I have to concede that, for all practical purposes, in terms of both memory and speed, the cost is negligible. Still, I thought it was interesting to see that raising a lot of events the "conventional" way does cost something, and that in extreme cases it can even lead to gen 1 garbage collections, which may or may not matter depending on the situation (the more "real-time" a system needs to be, in my experience, the more important it is to be mindful of where garbage is created and how to minimize it where appropriate).
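For reference, the kind of measurement I ran can be sketched roughly like this (the `AllocationProbe`/`ValueEventArgs` names are my own invention, and `GC.GetAllocatedBytesForCurrentThread` requires .NET Core 3.0 or later):

```csharp
using System;

public static class AllocationProbe
{
    // Hypothetical payload type, invented just for this measurement.
    private sealed class ValueEventArgs : EventArgs
    {
        public int Value;
        public ValueEventArgs(int value) => Value = value;
    }

    private static event EventHandler<ValueEventArgs> ValueChanged;

    // Returns the average number of bytes allocated per event raise.
    public static double MeasurePerRaise(int iterations)
    {
        ValueChanged += (s, e) => { }; // no-op handler so the event actually fires

        // Available on .NET Core 3.0+; counts allocations on this thread only.
        long before = GC.GetAllocatedBytesForCurrentThread();
        for (int i = 0; i < iterations; i++)
            ValueChanged?.Invoke(null, new ValueEventArgs(i));
        long after = GC.GetAllocatedBytesForCurrentThread();

        return (after - before) / (double)iterations;
    }

    public static void Main()
    {
        Console.WriteLine($"Bytes allocated per raise: {MeasurePerRaise(1_000_000):F1}");
    }
}
```

With a non-empty `TEventArgs` like this one, the per-raise figure comes out positive, which is exactly the garbage I was asking about.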
This might seem like a dumb question. I realize that Windows Forms, for instance, could easily be considered a "high-performance" scenario, with hundreds or even thousands of events raised in very rapid succession (e.g., the `Control.MouseMove` event) all the time. But I still wonder whether it's really reasonable to design a class with .NET events when the class is expected to be used in high-performance, time-critical code.
The main concern I have is with the convention that one use something like `EventHandler<TEventArgs>` for all events, where `TEventArgs` derives from `EventArgs` and is in all likelihood a class that must be instantiated every single time the event is raised/handled. (If it's just plain `EventArgs`, obviously, this is not the case, as `EventArgs.Empty` can be used; but assuming any meaningful, non-constant information is contained in the `TEventArgs` type, instantiation will probably be needed.) It seems like this results in more GC pressure than I would expect a high-performance library to create.
That said, the only alternatives I can think of are:
- Using unconventional delegate types (i.e., not `EventHandler<TEventArgs>`) for events, taking only parameters that don't require object instantiation such as `int`, `double`, etc. (even `string`, passing references to existing string objects).
- Skipping events altogether and using virtual methods, forcing client code to override them as desired. This seems to have basically the same effect as the previous idea but in a somewhat more controlled way.
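Both alternatives can be sketched roughly as follows (again, the `LeanTicker`/`TickerBase` names and the price example are my own; this is just one way to write them):

```csharp
using System;

// Alternative 1: an unconventional delegate that passes values directly,
// so raising the event allocates no TEventArgs instance.
public delegate void PriceChangedHandler(double oldPrice, double newPrice);

public class LeanTicker
{
    public event PriceChangedHandler PriceChanged;
    private double _price;

    public void Publish(double price)
    {
        double old = _price;
        _price = price;
        PriceChanged?.Invoke(old, price); // nothing newed up per raise
    }
}

// Alternative 2: no event at all; client code derives and overrides.
public class TickerBase
{
    private double _price;

    public void Publish(double price)
    {
        double old = _price;
        _price = price;
        OnPriceChanged(old, price); // direct virtual call, no delegate, no allocation
    }

    protected virtual void OnPriceChanged(double oldPrice, double newPrice) { }
}
```

The virtual-method version trades the multicast flexibility of events for a single, statically known notification point, which is what I meant by "more controlled."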
Are my concerns about the GC pressure of .NET events unfounded to begin with? If so, what am I missing there? Or, is there some third alternative that is better than the two I've just listed?