I've implemented a UVC video viewing application using the Source Reader in asynchronous mode (i.e., OnReadSample() callbacks). The connected camera produces RAW10 frames; the application can display just the raw images, or additionally process each frame (within the OnReadSample() callback) and display the generated output as well (i.e., two viewers). Both images display correctly, with the exception of a camera-to-display lag that accumulates because the per-frame processing time exceeds the frame period (1/FPS).
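For context, my callback follows this general shape (a trimmed sketch, not my actual code; EnqueueForProcessing() is a placeholder for my processing/display hand-off):

```cpp
#include <windows.h>
#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>

// Placeholder: hands the sample to my processing/display path.
// Not part of Media Foundation.
void EnqueueForProcessing(IMFSample* pSample);

class ReaderCallback : public IMFSourceReaderCallback
{
    long m_refCount = 1;
    IMFSourceReader* m_pReader = nullptr; // set after the reader is created

public:
    void SetReader(IMFSourceReader* pReader) { m_pReader = pReader; }

    // IMFSourceReaderCallback
    STDMETHODIMP OnReadSample(HRESULT hrStatus, DWORD dwStreamIndex,
                              DWORD dwStreamFlags, LONGLONG llTimestamp,
                              IMFSample* pSample) override
    {
        if (SUCCEEDED(hrStatus) && pSample)
        {
            // Take a reference and hand the RAW10 frame off; the
            // heavy processing happens outside this callback.
            pSample->AddRef();
            EnqueueForProcessing(pSample);
        }

        // Request the next frame. Until this call is made, the reader
        // delivers nothing more, and incoming frames pile up upstream --
        // which is the overrun scenario I'm asking about below.
        if (m_pReader && !(dwStreamFlags & MF_SOURCE_READERF_ENDOFSTREAM))
        {
            m_pReader->ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM,
                                  0, nullptr, nullptr, nullptr, nullptr);
        }
        return S_OK;
    }

    STDMETHODIMP OnFlush(DWORD) override { return S_OK; }
    STDMETHODIMP OnEvent(DWORD, IMFMediaEvent*) override { return S_OK; }

    // IUnknown
    STDMETHODIMP QueryInterface(REFIID riid, void** ppv) override
    {
        if (riid == IID_IUnknown || riid == __uuidof(IMFSourceReaderCallback))
        {
            *ppv = static_cast<IMFSourceReaderCallback*>(this);
            AddRef();
            return S_OK;
        }
        *ppv = nullptr;
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef() override
    {
        return InterlockedIncrement(&m_refCount);
    }
    STDMETHODIMP_(ULONG) Release() override
    {
        ULONG count = InterlockedDecrement(&m_refCount);
        if (count == 0) delete this;
        return count;
    }
};
```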
How does the Media Source handle an overrun scenario? My understanding (please correct me if I'm wrong) is that new IMFSamples (i.e., image containers) are created and queued, but I've yet to find documentation on what happens when the maximum queue depth is reached.
Also, can the Media Source's queue depth be set to a particular number?
Some additional system details:
- Win 10
- Direct3D9
Thanks, Steve.