Please be kind; I'm just learning C#, and inheriting this application from a former employee is my first C# project.
I am observing inconsistent, slower-than-expected tick periods from a System.Windows.Forms.Timer. The application is written in C# with Microsoft Visual Studio.
The timer is set for an interval of 100 msec, yet I am observing periods ranging from 110 msec to 180 msec.
I am using several tools to observe this, including:
- a software oscilloscope (the Iocomp.Instrumentation.Plotting.Plot package),
- a real oscilloscope,
- letting the timer run for some time and comparing (number of ticks × 100 msec) against both the system time and a stopwatch (a sketch of this comparison is below).
In all cases the timer runs roughly 10% slow, and the cumulative lag becomes evident within the first few seconds.
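In case it helps, here is a minimal sketch of that last measurement. This is my own stripped-down repro, not the inherited code verbatim, and all the names in it are mine:

```csharp
using System;
using System.Diagnostics;
using System.Windows.Forms;

static class TimerDriftRepro
{
    [STAThread]
    static void Main()
    {
        var form = new Form { Text = "Timer drift repro" };
        var label = new Label { Dock = DockStyle.Fill };
        form.Controls.Add(label);

        var stopwatch = new Stopwatch();
        long tickCount = 0;

        var timer = new Timer { Interval = 100 }; // 100 msec, same as the real app
        timer.Tick += (s, e) =>
        {
            // Start timing on the first tick so the measurement window
            // begins at a tick boundary.
            if (!stopwatch.IsRunning) { stopwatch.Start(); return; }

            tickCount++;
            double expectedMs = tickCount * 100.0; // what exact 100 msec ticks would give
            double actualMs = stopwatch.Elapsed.TotalMilliseconds;
            label.Text = $"ticks: {tickCount}  expected: {expectedMs:F0} ms  " +
                         $"actual: {actualMs:F0} ms  avg period: {actualMs / tickCount:F1} ms";
        };

        timer.Start();
        Application.Run(form);
    }
}
```

Running this on my machine, the "actual" column pulls ahead of "expected" within a few seconds, matching what the oscilloscopes show.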
The methods that are executed with each tick take less than 4 msec to run, and there is no time-consuming asynchronous processing happening either. I had assumed this wouldn't matter anyway, since (as far as I know) the timer tick fires like an interrupt rather than being queued as an event for a handler to process later.
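For reference, this is roughly how I timed the per-tick work. `HandleTick` is a stand-in name I'm using here for the real methods the inherited app runs on each tick:

```csharp
using System.Diagnostics;
using System.Windows.Forms;

// ...inside the form's setup code; HandleTick stands in for the real work.
var handlerWatch = new Stopwatch();
var timer = new Timer { Interval = 100 };
timer.Tick += (s, e) =>
{
    handlerWatch.Restart();
    HandleTick(); // the actual per-tick methods
    handlerWatch.Stop();
    // This consistently reports under 4 ms on my machine.
    Debug.WriteLine($"tick work: {handlerWatch.Elapsed.TotalMilliseconds:F2} ms");
};
timer.Start();
```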
Has anyone experienced a problem like this before? What were the root causes?
Thanks.