I'm working on Gazebo Sim, which uses the GStreamer plugin to stream camera video over UDP. The simulation runs on Ubuntu 18.04.
There are some resources that explain the backend of this setup, e.g. the Gazebo Simulation PX4 Guide,
which describes how to create the pipeline:
The video from Gazebo should then display in QGroundControl just as it would from a real camera.
It is also possible to view the video using the Gstreamer Pipeline. Simply enter the following terminal command:
gst-launch-1.0 -v udpsrc port=5600 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' \
! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink fps-update-interval=1000 sync=false
This works well in the terminal. I have also read these questions:
using gstreamer with python opencv to capture live stream?
Write in Gstreamer pipeline from opencv in python
So I tried to implement this pipeline in OpenCV using the following lines:
import sys
import cv2

video = cv2.VideoCapture(
    'udpsrc port=5600 caps="application/x-rtp, media=(string)video, '
    'clock-rate=(int)90000, encoding-name=(string)H264" '
    '! rtph264depay ! avdec_h264 ! videoconvert '
    '! autovideosink fps-update-interval=1000 sync=false',
    cv2.CAP_GSTREAMER)
# video.set(cv2.CAP_PROP_BUFFERSIZE, 3)

# Exit if the video could not be opened.
if not video.isOpened():
    print("Could not open video")
    sys.exit()

# Read the first frame.
ok, frame = video.read()
if not ok:
    print('Cannot read video file')
    sys.exit()
But it only prints the error:
Could not open video
I have tried different variations of this pipeline in OpenCV, but none of them worked.
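For example, one variation I tried terminates the pipeline in an appsink element instead of autovideosink, since (as far as I understand) appsink is the element OpenCV's GStreamer backend pulls frames from, while a display sink like autovideosink has no way to hand frames back to the application:

```python
# Same pipeline as above, but terminated in an appsink element so that
# frames can be handed back to OpenCV instead of being displayed in a
# GStreamer window (drop=true discards stale buffers).
pipeline = (
    'udpsrc port=5600 caps="application/x-rtp, media=(string)video, '
    'clock-rate=(int)90000, encoding-name=(string)H264" '
    '! rtph264depay ! avdec_h264 ! videoconvert '
    '! appsink drop=true sync=false'
)
print(pipeline)
# Passed to OpenCV as: cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
```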