
I spent a lot of time trying to find an answer, but I've failed. I have an IP camera with RTSP access (H.264). If I open the stream in OpenCV, it tries to decode it (as I understand) via FFmpeg and corrupts frames.

[h264 @ 0x1821f20] error while decoding MB 3 35, bytestream -14

As I understand it, I should use GStreamer for hardware acceleration (with an NVIDIA GPU), but I can't find a real solution. What should I do?

  1. run GStreamer from the console and read frames from a buffer (RAM?) in an OpenCV app?
  2. do something inside the OpenCV app?
  3. re-compile OpenCV with GStreamer and without FFmpeg?
chrns

1 Answer


Refer to the GStreamer 1.0 playback examples in http://developer.download.nvidia.com/embedded/L4T/r21_Release_v3.0/L4T_Jetson_TK1_Multimedia_User_Guide_V2.1.pdf.

gst-launch-1.0 filesrc location=<filename.mp4> ! qtdemux name=demux ! h264parse ! omxh264dec ! nveglglessink -e

Instead of a file source you probably want to use the rtspsrc element. If you are lucky, something like this may work (decodebin auto-plugs a suitable decoder, such as omxh264dec, itself):

gst-launch-1.0 rtspsrc location=<rtsp://url> ! decodebin ! nveglglessink -e

It may be that you need to manually insert an rtph264depay element or others, though.
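A minimal sketch of such a fully explicit pipeline, assuming the Jetson's omxh264dec element is available (untested; the latency setting is an optional tweak, not from the original answer):

```shell
# Depayload H.264 from RTP, parse it, decode on the hardware decoder,
# and display via EGL/GLES.
gst-launch-1.0 rtspsrc location=<rtsp://url> latency=0 \
    ! rtph264depay ! h264parse ! omxh264dec ! nveglglessink -e
```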

If you are super lucky it may just work with something like this:

gst-launch-1.0 playbin uri=<rtsp://url> -e

Florian Zwoch
  • Okay, I will test it tomorrow, but I still have questions: a) should I run this command from a terminal? How do I get a frame from the GStreamer buffer? b) Can I still use CUDA for computing (I mean, using the 192 graphics cores) after the decoder? – chrns Feb 24 '16 at 19:04
  • This is a terminal command, yes. For access to the buffer you will have to write a GStreamer application. Look into the `appsink` documentation on how to do it. Why wouldn't you be able to do some more calculation on the CUDA cores at this point? – Florian Zwoch Feb 25 '16 at 08:52
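As an alternative to writing a full GStreamer application, OpenCV built with GStreamer support can consume a pipeline string that ends in `appsink` directly via `cv2.VideoCapture`. A hedged sketch (the element choices and `latency` value are assumptions; `gst_pipeline` is a hypothetical helper, not part of any API):

```python
# Hypothetical helper: build a GStreamer pipeline string for OpenCV's
# cv2.VideoCapture. Requires OpenCV compiled with GStreamer support and
# the Jetson's omxh264dec element; otherwise swap in a software decoder
# such as avdec_h264.
def gst_pipeline(rtsp_url: str) -> str:
    # rtspsrc pulls the RTSP stream; rtph264depay extracts H.264 from RTP;
    # h264parse prepares it for the decoder; omxh264dec decodes in hardware;
    # videoconvert produces BGR frames; appsink hands them to the application.
    return (
        f"rtspsrc location={rtsp_url} latency=0 "
        "! rtph264depay ! h264parse ! omxh264dec "
        "! videoconvert ! video/x-raw,format=BGR ! appsink"
    )

# Usage (uncomment with a real camera URL and a GStreamer-enabled OpenCV build):
# import cv2
# cap = cv2.VideoCapture(gst_pipeline("rtsp://<camera-url>"), cv2.CAP_GSTREAMER)
# ok, frame = cap.read()  # frame is a NumPy BGR array, ready for CUDA work
```

The decoded frames land in ordinary host memory, so nothing stops you from uploading them back to the GPU for CUDA processing afterwards.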