
I'd like to send and receive video frames via UDP or TCP using GStreamer on a Jetson TX1.

It seems I can send video frames with the pipeline below.

Sender:

gst-launch-1.0 nvcamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1024, height=(int)720, format=(string)I420, framerate=(fraction)30/1' !  nvvidconv flip-method=2 ! udpsink host=<BCast IP addr> auto-multicast=true port=8089

Receiver:

gst-launch-1.0 -v udpsrc port=8089 ! ... ! nvoverlaysink -e

UPDATE: This seemed fine, but I still get black frames on the receiver.

 gst-launch-1.0 -v udpsrc port=8089 ! capsfilter caps='video/x-raw(memory:NVMM),width=244,height=244, format=I420, framerate=20/1' ! nvoverlaysink -e

I don't know what filters need to be added on the receiver side. (I tried videoparse, but got the error "videoparse: event not found".) Also, is there a way to capture each video frame (image) using a Python script? Ultimately, I'd like to capture each frame in JPEG or PNG format from a Python script. With the script below I can test whether a receiver gets data from the sender (video source), but I still have the issues mentioned above.

import socket
import sys

HOST = ''   # Symbolic name meaning all available interfaces
PORT = 8089 # Arbitrary non-privileged port

# Datagram (UDP) socket
try:
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    print('Socket created')
except socket.error as msg:
    print('Failed to create socket. Error: ' + str(msg))
    sys.exit()


# Bind socket to local host and port
try:
    s.bind((HOST, PORT))
except socket.error as msg:
    print('Bind failed. Error: ' + str(msg))
    sys.exit()

numFrames = 0
while True:
    # Receive a datagram from the sender: (data, addr)
    data, addr = s.recvfrom(4096)

    if not data:
        break

    # Echo the payload back with an OK prefix (data is bytes)
    reply = b'OK...' + data

    s.sendto(reply, addr)
    print('Message[' + addr[0] + ':' + str(addr[1]) + '] - ' + data.strip().decode(errors='replace'))

1 Answer


To answer the first of your questions: you need to include another GStreamer element, rtpvrawpay, to packetize the raw video into an RTP payload suitable for UDP streaming. Since rtpvrawpay does not accept NVIDIA's NVMM video, change the caps to force nvvidconv to convert it to normal video/x-raw.

gst-launch-1.0 -e nvcamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1024, height=(int)720, format=(string)I420, framerate=(fraction)30/1' !  nvvidconv flip-method=2 ! 'video/x-raw, width=(int)1024, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! rtpvrawpay ! udpsink host=<BCast IP addr> auto-multicast=true port=8089

That should give you a valid stream. To depay the payload try:

 gst-launch-1.0 -v udpsrc port=8089 ! rtpvrawdepay ! capsfilter caps='video/x-raw,width=244,height=244, format=I420, framerate=20/1' ! nvoverlaysink -e

Note that we are NOT taking the NVMM format, just standard video/x-raw. If you do

 gst-inspect-1.0 nvoverlaysink 

you will see it accepts NVMM or standard x-raw video.

Also see this answer: Stream H.264 video over rtp using gstreamer, and the RidgeRun pipelines: https://developer.ridgerun.com/wiki/index.php?title=Gstreamer_pipelines_for_Tegra_X1
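If you later want a compressed stream instead of raw RTP, the same pattern applies with the H.264 elements. The sender/receiver pair below is only a sketch along the lines of the linked answer, assuming the omxh264enc/omxh264dec plugins shipped with the Jetson's L4T release are installed; element names and caps may need adjusting for your setup.

gst-launch-1.0 -e nvcamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1024, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! omxh264enc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=<BCast IP addr> port=8089

gst-launch-1.0 -v udpsrc port=8089 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96' ! rtph264depay ! h264parse ! omxh264dec ! nvoverlaysink -e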

Note that I added -e to your GStreamer pipelines; if you are running them from the command line, Ctrl-C won't close the stream properly without it.

Your second question is a bigger exercise. https://github.com/markw63/pythongst/blob/master/README.md is an example of code that does this type of job using appsink. That code gets its video and audio from a camera, but it could just as easily take its input from a udpsrc as above, connect it to an appsink, and then send messages and data for each buffer (usually one frame). It is possible to set up GStreamer to split and capture any stream into individual JPEGs (or whatever) by using an appsink (line 28 of the example) and message-posting elements in the pipeline, with a message for each frame being passed on the DBUS (bus_signal_watch) so that individual frames can be isolated and handled. As shown, that version needs multi-threading to work well, with two separate threads: one for the GStreamer loop and one for the GObject loop.
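As a minimal sketch of that idea (not the linked project's exact code), the receiver below assumes the raw RTP sender from earlier, lets jpegenc do the JPEG encoding inside the pipeline, and writes each encoded buffer to a numbered file from the appsink callback. The caps values (width, height, payload) and the output filenames are assumptions and must match your sender.

#!/usr/bin/env python3
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

# udpsrc needs explicit RTP caps so rtpvrawdepay knows the frame geometry.
PIPELINE = (
    'udpsrc port=8089 caps="application/x-rtp, media=(string)video, '
    'clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, '
    'depth=(string)8, width=(string)1024, height=(string)720, payload=(int)96" '
    '! rtpvrawdepay ! videoconvert ! jpegenc '
    '! appsink name=sink emit-signals=true max-buffers=2 drop=true'
)

frame_count = 0

def on_new_sample(sink):
    # Called once per JPEG-encoded buffer; write it straight to disk.
    global frame_count
    sample = sink.emit('pull-sample')
    buf = sample.get_buffer()
    ok, info = buf.map(Gst.MapFlags.READ)
    if ok:
        with open('frame_%05d.jpg' % frame_count, 'wb') as f:
            f.write(info.data)
        buf.unmap(info)
        frame_count += 1
    return Gst.FlowReturn.OK

pipeline = Gst.parse_launch(PIPELINE)
pipeline.get_by_name('sink').connect('new-sample', on_new_sample)
pipeline.set_state(Gst.State.PLAYING)

loop = GLib.MainLoop()
try:
    loop.run()
except KeyboardInterrupt:
    pass
finally:
    pipeline.set_state(Gst.State.NULL)

Here a single GLib main loop is enough because the per-frame work is just a file write; the linked example uses separate threads because it does more in its callbacks.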
