I was experiencing a ~5 s delay when playing an RTSP stream from an IP camera. After a bunch of googling (especially this question), I reduced the delay to ~1 s using the following command:
ffplay -fflags nobuffer -flags low_delay -framedrop -strict experimental \
-probesize 32 -sync ext rtsp://xxx.xxx.xxx.xxx
But when I tried the mplayer -benchmark command from the same question, the delay went away almost entirely (i.e. near-zero delay).
In the man page of mplayer, it says:
-benchmark
Prints some statistics on CPU usage and dropped frames at the end of playback. Use in combination with -nosound and -vo null for benchmarking only the video codec.
NOTE: With this option MPlayer will also ignore frame duration when playing only video (you can think of that as infinite fps).
I feel this "ignore frame duration" is the key to the question, but after a bunch of googling I didn't find any flag related to it in ffmpeg. How can I force ffmpeg to ignore the input frame duration?
On the other hand, the reason I'm using ffmpeg is that I need to do image processing with opencv, which seems to use parts of ffmpeg under the hood when opening the stream with
cv.VideoCapture('rtsp://xxx.xxx.xxx.xxx')
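(As an aside: I assume some of the same options could be forwarded to OpenCV's FFmpeg backend via the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable, which takes "key;value|key;value" pairs. The option names below are my guess at the equivalents of the ffplay flags above; I haven't verified that the RTSP demuxer accepts all of them or that they help with the delay:)

import os
import cv2 as cv

# Assumption: these key;value pairs are forwarded to FFmpeg when the
# capture is opened; they mirror the ffplay flags above (unverified).
# Must be set before cv.VideoCapture is created.
os.environ['OPENCV_FFMPEG_CAPTURE_OPTIONS'] = (
    'rtsp_transport;tcp|fflags;nobuffer|flags;low_delay'
)

cap = cv.VideoCapture('rtsp://xxx.xxx.xxx.xxx', cv.CAP_FFMPEG)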
A solution that directly solves the problem in opencv would be even more appreciated. I did try reading the VideoCapture repeatedly in a separate thread (roughly the sketch below), but that didn't help.
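(For reference, the threaded reading I tried was roughly this: a daemon thread keeps draining the stream so that only the newest frame is kept. Class and method names here are mine, just to illustrate the approach:)

import threading
import cv2 as cv

class LatestFrameReader:
    """Keep only the most recent frame from the capture."""

    def __init__(self, url):
        self.cap = cv.VideoCapture(url)
        self.lock = threading.Lock()
        self.frame = None
        # Daemon thread drains the stream as fast as possible so
        # stale frames don't pile up in the internal buffer.
        threading.Thread(target=self._drain, daemon=True).start()

    def _drain(self):
        while True:
            ok, frame = self.cap.read()
            if not ok:
                break
            with self.lock:
                self.frame = frame  # overwrite: keep only the newest

    def read(self):
        with self.lock:
            return self.frame

reader = LatestFrameReader('rtsp://xxx.xxx.xxx.xxx')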
Some info about the RTSP stream: h264, 1920x1080, 15 fps, one key frame every 4 s
Some other solutions I tried:
ffmpeg -r 99999 -i ...     # didn't work
mplayer ... -dumpstream    # it core dumped