
We've created a custom push source / parser filter that is expected to work in a DirectShow Editing Services timeline.

Now everything is great except that the filter does not stop delivering samples when the current cut has reached its end. The rendering stops, but the downstream filter continues to consume samples, and the filter keeps delivering them until it reaches EOF. This causes high CPU load, so the application is simply unusable.

After a lot of investigation I'm not able to find a suitable mechanism that informs my filter that the cut is over and that it needs to stop:

  • The Deliver function on the connected decoder pins always returns S_OK, meaning the attached decoder is also not aware that the IMediaSamples are being discarded downstream.

  • There is no flushing in the filter graph.

  • The IMediaSeeking::SetPositions interface is used, but only the start position is set – our filter is always instructed to play up to the end of the file. I would expect that calling IAMTimelineSrc::SetMediaTimes(Start, Stop) from the application would set a stop time too, but this does not happen (a sketch of the call we make is shown after this list).

  • I've also tried to manipulate the XTL timeline, adding 'mstop' attributes to all the clips, in the hope that this would cause a stop position to be set, but to no avail.
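
For reference, this is a minimal sketch of how the media times are set from the application side; the helper name, the variable names and the concrete time values are illustrative only, and pClipObj is assumed to already point to the clip's IAMTimelineObj:

    #include <qedit.h>

    HRESULT SetClipTimes(IAMTimelineObj *pClipObj)
    {
        // Where the cut sits on the timeline: 0 s .. 5 s (100-ns units).
        const REFERENCE_TIME tlStart = 0;
        const REFERENCE_TIME tlStop  = 5  * 10000000LL;

        // Which part of the media file to play: 10 s .. 15 s.
        const REFERENCE_TIME mediaStart = 10 * 10000000LL;
        const REFERENCE_TIME mediaStop  = 15 * 10000000LL;

        HRESULT hr = pClipObj->SetStartStop(tlStart, tlStop);
        if (FAILED(hr)) return hr;

        IAMTimelineSrc *pSrc = NULL;
        hr = pClipObj->QueryInterface(IID_IAMTimelineSrc, (void**)&pSrc);
        if (FAILED(hr)) return hr;

        // The call we expected to end up as a stop time in
        // IMediaSeeking::SetPositions on our source filter.
        hr = pSrc->SetMediaTimes(mediaStart, mediaStop);
        pSrc->Release();
        return hr;
    }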

From the filter's point of view, the output buffers are always available (since the IMediaSamples are being discarded downstream), so the filter fills samples as fast as it can until the source file is finished.

Is there any way the filter can detect when to stop, or can we do anything from the application side?

Many thanks

Tilo


3 Answers


I had a chance to work with DES and a custom push source filter recently. From my experience:

  • DES actually does return an error code from Receive(), which is in turn returned by the source's Deliver() call, when the cut reaches its end.
  • I hit a similar situation in which the source did not receive it and continued to run to the end of the stream.
  • The problem I found (after a huge amount of ad-hoc trials) is that the source needs to call DeliverNewSegment() on each restart after a seek; a sketch is shown after this list. DES seems to take incoming samples only after that notification. Without it, DES still appears to accept the samples with S_OK, but it just throws them away.
  • I don't see DES setting an end time via IMediaSeeking::SetPositions, either.
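
A minimal sketch of that notification, assuming the pin derives from CSourceStream plus CSourceSeeking from the DirectShow base classes (the class name is illustrative; m_rtStart, m_rtStop and m_dRateSeeking are the CSourceSeeking members):

    HRESULT CMyPushPin::OnThreadStartPlay()
    {
        // Called whenever the streaming thread (re)starts, i.e. also after a
        // seek issued by DES. Announce the new segment before pushing samples;
        // without this, DES accepts the samples but silently discards them and
        // never fails the Receive() call that would tell us the cut is over.
        return DeliverNewSegment(m_rtStart, m_rtStop, m_dRateSeeking);
    }

Once the notification is in place, the stock CSourceStream::DoBufferProcessingLoop should already stop pushing samples, since it returns as soon as Deliver() reports anything other than S_OK.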

I hope this helps, although this question is very old and I suppose Tilo does not care about this any more...


You can try adding a custom interface to your filter and calling a method on it from your client application. See this SO question for a bit more detail on this approach. You should be careful with thread safety while implementing this method, and it is indeed possible that there is a neater way of detecting that the capturing should be stopped.
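
A minimal sketch of that idea, assuming the DirectShow base classes; the interface name, the GUID and the member flag are invented here for illustration, not part of any existing API:

    #include <streams.h>
    #include <initguid.h>

    // Example GUID only - generate your own for a real filter.
    DEFINE_GUID(IID_IStopDelivery,
        0x12345678, 0x1234, 0x1234, 0x12, 0x34, 0x12, 0x34, 0x56, 0x78, 0x9a, 0xbc);

    DECLARE_INTERFACE_(IStopDelivery, IUnknown)
    {
        // Ask the source to stop pushing samples for the current cut.
        STDMETHOD(StopDelivery)(THIS) PURE;
    };

    // CMyPushSource is assumed to derive from CSource and IStopDelivery
    // (declaration not shown). Expose the interface and flip a flag that
    // FillBuffer() / the buffer processing loop checks on every sample.
    STDMETHODIMP CMyPushSource::NonDelegatingQueryInterface(REFIID riid, void **ppv)
    {
        if (riid == IID_IStopDelivery)
            return GetInterface(static_cast<IStopDelivery*>(this), ppv);
        return CSource::NonDelegatingQueryInterface(riid, ppv);
    }

    STDMETHODIMP CMyPushSource::StopDelivery()
    {
        InterlockedExchange(&m_lStopRequested, 1);   // LONG member, read by FillBuffer()
        return S_OK;
    }

On the application side you would then QueryInterface the source filter for IID_IStopDelivery and call StopDelivery() at the moment the cut is known to be over.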

yms

I'm not that familiar with DES, but I have tried my demux filters in DES, and the stop time was set correctly when there was a "stop=" attribute on the clip.

Perhaps your demux does not implement IMediaSeeking correctly. Do you expose IMediaSeeking through the pins?
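
If not, here is a rough sketch of one way to do it, using the CSourceSeeking helper from the DirectShow base classes (class and member names are illustrative, not from the original filter):

    #include <streams.h>

    // Output pin that answers IMediaSeeking itself; DES queries the pin and
    // calls SetPositions() on it to trim the clip to the cut.
    class CMyPushPin : public CSourceStream, public CSourceSeeking
    {
    public:
        CMyPushPin(HRESULT *phr, CSource *pFilter)
            : CSourceStream(NAME("MyPushPin"), phr, pFilter, L"Out")
            , CSourceSeeking(NAME("MyPushPinSeek"), (IPin*)this, phr, &m_cSharedState)
        {
        }

        // Route IMediaSeeking to CSourceSeeking, everything else to the pin.
        STDMETHODIMP NonDelegatingQueryInterface(REFIID riid, void **ppv)
        {
            if (riid == IID_IMediaSeeking)
                return CSourceSeeking::NonDelegatingQueryInterface(riid, ppv);
            return CSourceStream::NonDelegatingQueryInterface(riid, ppv);
        }

    protected:
        // Called by CSourceSeeking when SetPositions() changes the range.
        HRESULT ChangeStart()  { /* reposition the reader to m_rtStart */ return S_OK; }
        HRESULT ChangeStop()   { /* stop delivering once m_rtStop is reached */ return S_OK; }
        HRESULT ChangeRate()   { return S_OK; }

        // The usual CSourceStream overrides (media type, buffer size, sample
        // filling) stay the same as in any push source pin and are omitted here.
        HRESULT FillBuffer(IMediaSample *pSample);
        HRESULT GetMediaType(CMediaType *pMediaType);
        HRESULT DecideBufferSize(IMemAllocator *pAlloc, ALLOCATOR_PROPERTIES *pRequest);

    private:
        CCritSec m_cSharedState;   // lock shared with CSourceSeeking
    };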

Geraint Davies