Background:

  • I'm streaming video from a Bosch VideoJet x40 network video encoder. It reads in up to 4 analog camera inputs and outputs RTP streams.
  • I'm supplying test footage from an old VCR (Saving Private Ryan!), connected to one of the encoder's inputs with a single yellow composite video cable.
  • The RTP streams carry H.264-encoded video in UDP packets. The encoded video data is in the YUV I420 color space.

My GStreamer pipeline reads the RTP packets, decodes the H.264 video, converts the frames from YUV to raw RGB, and then saves them to a file. So far this works correctly:

gst-launch --gst-debug=2 -vv udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" port=33314 ! rtph264depay ! h264parse ! ffdec_h264 ! ffmpegcolorspace ! "video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false" ! filesink location="privateryan.rgb"

Here is the debug output of the command, which works mostly correctly and saves the raw RGB video output. The file becomes gigantic (as expected) after only a few seconds, so be careful if you run this command:

Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)704, height=(int)240, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264, width=(int)704, height=(int)240, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:src: caps = video/x-raw-yuv, width=(int)704, height=(int)240, framerate=(fraction)25/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)5/11
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:src: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false, width=(int)704, height=(int)240, pixel-aspect-ratio=(fraction)5/11, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:sink: caps = video/x-raw-yuv, width=(int)704, height=(int)240, framerate=(fraction)25/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)5/11
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false, width=(int)704, height=(int)240, pixel-aspect-ratio=(fraction)5/11, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false, width=(int)704, height=(int)240, pixel-aspect-ratio=(fraction)5/11, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false, width=(int)704, height=(int)240, pixel-aspect-ratio=(fraction)5/11, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321

After recording, I use the Vooya raw sequence player to inspect the files and play them back. However, the file only plays correctly if I tell Vooya that the video is interlaced.

I need whole, progressive frames packed sequentially in the file, so that I can extract them later for computer vision applications.
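If the frames are packed that way, each 704×240 RGB24 frame should occupy exactly 704 × 240 × 3 = 506,880 bytes, so (assuming that layout) a single frame can be pulled out of the file with dd and converted to a PNG with ImageMagick for inspection; the frame index here is just an example:

dd if=privateryan.rgb of=frame10.rgb bs=506880 skip=9 count=1
convert -size 704x240 -depth 8 rgb:frame10.rgb frame10.png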

Here you can see the video playing, albeit with the wrong video format (interlaced):

https://i.stack.imgur.com/kXhGm.png

And here you can see that the video does not play when I change the settings to what I need:

https://i.stack.imgur.com/RfJyv.png

So I've tried adding the deinterlace element to my pipeline, but with no success. What could I be doing wrong?

Here's my new pipeline, with deinterlace right before the filesink:

gst-launch --gst-debug=1 -v udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,frame-rate=(fraction)25/1" port=33314 ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! "video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false" ! deinterlace ! filesink location="privateryan.rgb"
0:00:00.096569700 12969       0x607080 ERROR           GST_PIPELINE ./grammar.y:614:gst_parse_perform_link: could not link ffmpegcsp0 to deinterlace0
WARNING: erroneous pipeline: could not link ffmpegcsp0 to deinterlace0
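My guess is that deinterlace may only negotiate YUV input in GStreamer 0.10, which would explain why it refuses to link against ffmpegcsp0's RGB output. One variant I plan to try next places deinterlace before ffmpegcolorspace, while the data is still I420 (untested sketch):

gst-launch -v udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" port=33314 ! rtph264depay ! h264parse ! ffdec_h264 ! deinterlace ! ffmpegcolorspace ! "video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1" ! filesink location="privateryan.rgb"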

Why does my video still appear interlaced after all that processing, and what could I be doing incorrectly with the deinterlace element?

I think the VCR or B&W camera may be interlacing the video at the source, but I'm not certain. Even if they are, I can't change that, and I still need to deinterlace.
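For what it's worth, to check whether the decoded stream itself shows interlacing artifacts (combing on motion), I can watch it directly, before any RGB conversion:

gst-launch udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" port=33314 ! rtph264depay ! h264parse ! ffdec_h264 ! xvimagesink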

Thanks!

Cinco
  • Have you tested with an alternative player? – umläute Mar 28 '13 at 13:33
  • Yes, in VLC it appears correctly, as well as with xvimagesink. My current theory is that the interlaced frames are being split in two and the frame rate doubled, i.e. all the top fields in one frame, all the bottom fields in the next, and then the player de-interlaces by weaving them together. I can't prove it yet, but does it sound plausible? – Cinco Mar 29 '13 at 13:42
  • An easy proof would be to read the file yourself (rather than via a media player) and check its content. Since it's uncompressed RGB, reading it should be fairly trivial. – umläute Mar 30 '13 at 07:27
