My task is to stream an IP camera's video (RTP/RTSP, H.264) through a J2EE application server to a browser. For this I am using GStreamer 1.21.3 (latest dev release) with the gstreamer-java library on top. We are aiming for a WebSocket solution, as traditional HLS introduces significant latency. After figuring out what to do with the gst-launch executable on the command line, I ended up with this code (for the moment):

/*
 * Configuration for RTSP over TCP to WebSocket:
 *  1. rtspsrc to ip camera
 *  2. rtph264depay ! h264parse to extract the H.264 content
 *  3. mp4mux to create fragmented MP4
 *  4. appsink to grab the frames and use them in Websocket server
 */
final String gstPipeline = String.format("rtspsrc onvif-mode=true protocols=tcp user-id=%s user-pw=%s location=%s latency=200"
        + " ! rtph264depay ! h264parse"
        + " ! mp4mux streamable=true fragment-duration=5000"
        + " ! appsink name=sink", USERNAME, PASSWORD, uri);
final Pipeline pipeline = initGStreamerPipeline(gstPipeline);
// Add listener to consume the incoming data
final AppSink sink = (AppSink) pipeline.getElementByName("sink");
sink.setCaps(Caps.anyCaps());
sink.set("emit-signals", true);
sink.set("max-buffers", 50);
sink.connect((AppSink.NEW_SAMPLE) appsink -> {
    final Sample sample = appsink.pullSample();
    if (sample == null)
    {
        return FlowReturn.OK;
    }

    final Buffer buffer = sample.getBuffer();
    try
    {
        final ByteBuffer buf = buffer.map(false);
        LOGGER.debug("Unicast HTTP/TCP message received: {}", new String(Hex.encodeHex(buf, true)));
        if (session != null)
        {
            try
            {
                // Hex.encodeHex consumed the buffer, so flip() rewinds it before sending
                buf.flip();
                session.getRemote().sendBytes(buf);
            }
            catch (final Exception e)
            {
                LOGGER.error("Failed to send data via WebSocket", e);
            }
        }
    }
    finally
    {
        buffer.unmap();
    }

    return FlowReturn.OK;
});
sink.connect((AppSink.EOS) s -> LOGGER.info("Appsink is EOS"));
sink.connect((AppSink.NEW_PREROLL) s -> {
    LOGGER.info("Appsink NEW_PREROLL");
    return FlowReturn.OK;
});

LOGGER.info("Connecting to {}", uri);

/*
 * Start the pipeline. Attach bus listeners to call Gst.quit() on error or EOS.
 */
pipeline.getBus().connect((Bus.ERROR) (source, code, message) -> {
    LOGGER.error("GStreamer error {}: {}", code, message);
    Gst.quit();
});
pipeline.getBus().connect((Bus.EOS) (source) -> Gst.quit());
pipeline.play();

/*
 * Wait until Gst.quit() is called.
 */
LOGGER.info("Starting to consume media stream...");
Gst.main();
pipeline.stop();
server.stop();
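
For completeness, initGStreamerPipeline is not shown above; it is essentially a thin wrapper around Gst.parseLaunch (minimal sketch below — it assumes Gst.init has already been called once at application startup):

private static Pipeline initGStreamerPipeline(final String description)
{
    // Assumption: Gst.init(...) already ran during application startup.
    // gst_parse_launch wraps the parsed elements in a top-level Pipeline.
    return (Pipeline) Gst.parseLaunch(description);
}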

Now I seem to be stuck, because the AppSink at the end of the pipeline never gets its new_sample signal triggered. The complete example works like a charm when I replace the appsink with a filesink. I have noticed that there are other threads (like this one) with similar problems, which normally boil down to "you forgot to set emit-signals=true". Any ideas why my appsink gets no data?

Update:

It appears that the problem is the URL I am passing in the pipeline string. It has two query parameters: http://192.168.xx.xx:544/streaming?video=0&meta=1. If I remove the second parameter (and the ampersand along with it), the pipeline works. Unfortunately I have found no documentation on how to escape URLs so that GStreamer can read them. Can anyone share such documentation?
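
The only escaping I could think of trying is plain percent-encoding of the ampersand (minimal sketch, untested — I don't know whether this is what GStreamer expects, or whether the camera decodes %26 back to '&'):

// Hypothetical: percent-escape the '&' before building the launch string,
// in case gst_parse_launch treats it specially (untested assumption)
final String escapedUri = uri.replace("&", "%26");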

Update 2:

It is getting weird now: it looks like the name of the URL parameter is the problem. I replaced it with a dummy argument and the pipeline works, so the ampersand itself is not the problem. Then I used VLC media player to consume the stream with the &meta=1 in place, which also worked. Is it possible that the string "meta" is treated specially in GStreamer?
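
One workaround I am considering (untested sketch, reusing the names from the code above) is to leave the location out of the launch string entirely and set it as a property afterwards, so that gst_parse_launch never sees the URL:

// Name the rtspsrc so it can be looked up, and omit location/credentials
// from the string that gst_parse_launch has to parse
final String gstPipeline = "rtspsrc name=src onvif-mode=true protocols=tcp latency=200"
        + " ! rtph264depay ! h264parse"
        + " ! mp4mux streamable=true fragment-duration=5000"
        + " ! appsink name=sink";
final Pipeline pipeline = initGStreamerPipeline(gstPipeline);
final Element src = pipeline.getElementByName("src");
src.set("user-id", USERNAME);
src.set("user-pw", PASSWORD);
src.set("location", uri); // raw URI, no launch-string escaping involved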
