
I'm trying to use MediaFrameReader (based on this article) in order to receive a stream of frames from the camera and process them. However, I get a strange behavior: the camera is configured for 30 fps, so I would expect each frame to arrive about every 33 milliseconds, right? Instead, most of them actually arrive after about 31 milliseconds, while every once in a while I get a slow frame that takes about 45 milliseconds. Sure, it averages out to exactly 30 frames per second, but the individual intervals are inconsistent.
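
For reference, here is roughly how I measure those intervals (a minimal sketch; `reader` stands for the MediaFrameReader set up as in the article):

```csharp
using System;
using System.Diagnostics;
using Windows.Media.Capture.Frames;

// Log the elapsed time between consecutive FrameArrived events.
var stopwatch = Stopwatch.StartNew();
long lastMs = 0;

reader.FrameArrived += (MediaFrameReader sender, MediaFrameArrivedEventArgs args) =>
{
    using (MediaFrameReference frame = sender.TryAcquireLatestFrame())
    {
        if (frame == null) return;

        long nowMs = stopwatch.ElapsedMilliseconds;
        Debug.WriteLine($"Inter-frame interval: {nowMs - lastMs} ms"); // mostly ~31 ms, occasionally ~45 ms
        lastMs = nowMs;
    }
};
```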

I understand that for most use cases this might not be a big deal; however, my processing requires accurate frame timing. I could "flatten the curve" myself using a queue, of course, but that seems backwards, and I suspect it might cause data corruption. Any idea how I should resolve this strange behavior?

1 Answer


Accurate timing is not, in general, possible on Windows: garbage collections, preemption, and so on can pause your process at any point to do other things. You can minimize these effects, but for guaranteed timing you would need a real-time OS.

I would suggest starting with some profiling to check for blockages of any kind. A delay of 10 ms is easily explained by a garbage collection. If that turns out to be the cause, you might want to avoid or minimize allocations to reduce the number of collections.
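
For example, keeping the GC out of the hot path could look something like this (just a sketch; the buffer size is an assumption you would adapt to your actual frame format, and GC latency modes behave somewhat differently across runtimes):

```csharp
using System;
using System.Runtime;

static class FrameProcessingSetup
{
    // Preallocated once and reused for every frame, so the frame handler
    // itself allocates nothing and adds no GC pressure.
    // 1920x1080 BGRA8 is an assumption; size it for your real format.
    static readonly byte[] PixelBuffer = new byte[1920 * 1080 * 4];

    public static void Configure()
    {
        // Trade a little throughput for fewer/shorter blocking collections.
        GCSettings.LatencyMode = GCLatencyMode.SustainedLowLatency;
    }
}
```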

It also depends on what the actual goal is. If you just want a steady frame rate, buffering is a perfectly appropriate strategy and should not cause data corruption if done right. There is also a multimedia timer that can be used to get better resolution than the regular timers.
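
A sketch of that approach: queue the frames as they arrive and drain the queue at a fixed cadence, using the multimedia timer API (winmm) to raise the system timer resolution. `TFrame` is a placeholder for whatever data you copy out of each frame (don't hold the MediaFrameReference itself across frames, or the reader will stall), and calling into winmm assumes a desktop context:

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Threading;

class FramePacer<TFrame>
{
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint uPeriod); // raise timer resolution to ~1 ms

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint uPeriod);

    readonly ConcurrentQueue<TFrame> _queue = new ConcurrentQueue<TFrame>();

    // Producer side: call this from the FrameArrived handler.
    public void Enqueue(TFrame frame) => _queue.Enqueue(frame);

    // Consumer side: processes one frame every `interval`, regardless of
    // how unevenly the frames arrived.
    public void Run(TimeSpan interval, Action<TFrame> process, CancellationToken ct)
    {
        timeBeginPeriod(1);
        try
        {
            var clock = Stopwatch.StartNew();
            long nextMs = 0;
            while (!ct.IsCancellationRequested)
            {
                if (_queue.TryDequeue(out var frame))
                    process(frame);

                nextMs += (long)interval.TotalMilliseconds;
                long sleepMs = nextMs - clock.ElapsedMilliseconds;
                if (sleepMs > 0) Thread.Sleep((int)sleepMs);
            }
        }
        finally
        {
            timeEndPeriod(1);
        }
    }
}
```

Pairing Thread.Sleep with timeBeginPeriod(1) is what makes a ~33 ms cadence achievable; without it, sleeps quantize to the default timer tick of roughly 15.6 ms.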

If you want to correlate frames with some kind of external event, you may need the timestamp of when the image was actually captured, or some kind of hardware trigger to correlate the frame with whatever event you want to monitor. Cameras intended for industrial automation usually have more timing features than general-purpose webcams.
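
With MediaFrameReader specifically, each frame already carries such a timestamp: MediaFrameReference.SystemRelativeTime is assigned by the capture pipeline, so it should reflect when the frame was captured rather than when your handler happened to run. A sketch, again assuming a `reader` set up as in the linked article:

```csharp
using System;
using System.Diagnostics;
using Windows.Media.Capture.Frames;

// Read the capture timestamp carried by each frame instead of relying
// on when FrameArrived fires.
reader.FrameArrived += (MediaFrameReader sender, MediaFrameArrivedEventArgs args) =>
{
    using (MediaFrameReference frame = sender.TryAcquireLatestFrame())
    {
        if (frame?.SystemRelativeTime is TimeSpan captured)
        {
            Debug.WriteLine($"Captured at {captured.TotalMilliseconds} ms (relative to system boot)");
        }
    }
};
```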

JonasH