5

I'd like to use OBS to stream via RTMP to an nginx server, and then locally send the RTMP fragments to WebRTC so that they can be transmitted to the client via a MediaStream. I think this is possible, as it is essentially described here. I'm doing this because the multi-second latency of HLS is not appropriate for what I'm trying to do.

I'm having trouble extracting the RTMP fragments from nginx; the only plausible directive I could find for doing this in the documentation was pull rtmp://.... When I tried it, I did not see any files appearing in my root folder, where I would normally find the HLS files if I were using hls on. Does anyone know how to accomplish what I'm trying to achieve above?

Thanks!

wisdomtoothman
  • 123
  • 2
  • 6

2 Answers

7

This is easily possible! You could base it on Pion's rtp-to-webrtc example, which lets you easily get media from ffmpeg into the browser.

The ffmpeg command you would run instead looks like this (the first output carries Opus audio, the second copies the video track): ffmpeg -re -i rtmp://localhost:1935/$app/$name -vn -acodec libopus -f rtp rtp://localhost:6000 -vcodec copy -an -f rtp rtp://localhost:5000 -sdp_file video.sdp

I would consider transcoding to VP8 since not all browsers support H264.
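A hedged variant of the command above that transcodes to VP8 instead of copying H.264 might look like this (the ports, app, and stream names are the same assumptions as in the original command; bitrate is illustrative):

```shell
# Same as the command above, but transcode video to VP8 with libvpx
# instead of copying H.264; -deadline realtime keeps encoding latency low.
ffmpeg -re -i rtmp://localhost:1935/$app/$name \
  -vn -acodec libopus -f rtp rtp://localhost:6000 \
  -an -vcodec libvpx -deadline realtime -b:v 1M -f rtp rtp://localhost:5000 \
  -sdp_file video.sdp
```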

---

If you want sub-second playback in the browser, I would check out Project Lightspeed; that's your best option today IMO.

Sean DuBois
  • 3,972
  • 1
  • 11
  • 22
  • Thanks a lot for the help! I was also wondering: 1) Why is it necessary to convert the tcp/rtmp to a udp/rtp stream? Is there no way to directly listen to the tcp/rtmp port from within Go? 2) Do we have to use Go to do the port listening? [Others](https://stackoverflow.com/questions/47381258/javascript-listen-on-a-port) seem to suggest it's not possible to easily listen to UDP/TCP ports using javascript only. – wisdomtoothman Feb 14 '21 at 22:17
  • 1) You could totally do that! I don't have a ready-made example of in-memory RTMP -> WebRTC, but it could easily be done. If you do build it, please share; I am sure others would find it helpful. 2) Yes, you could totally use nodejs! I think your best option would be werift, followed by wrtc. You will need to run some sort of 'WebRTC bridge' on a server though. – Sean DuBois Feb 15 '21 at 06:53
  • @SeanDuBois it seems that Project Lightspeed uses the FTL protocol. Is there any way we can publish to Lightspeed using ffmpeg or gstreamer? Thank you! – AFortunato Jun 19 '21 at 11:06
  • What are you trying to do? If you can use ffmpeg or GStreamer RTP -> WebRTC (or even WebRTC client -> SFU) -- FTL is kind of esoteric but is the only sub-second protocol supported by OBS – Sean DuBois Jun 20 '21 at 15:26
  • I'm trying to send video streamed with RTMP (from a mobile phone to an NGINX server) to a browser using WebRTC. After posting this, I found the example-webrtc-applications repo with the RTMP->WebRTC example, but I'm facing a weird issue, can you please take a look? https://stackoverflow.com/questions/68059045/webrtc-video-not-showing. Thank you :) – AFortunato Jun 21 '21 at 20:01
1

There are better and simpler solutions for low-latency streaming: either convert RTMP to WebRTC, or use HTTP-FLV or HTTP-TS.

Convert RTMP to WebRTC

The easiest way to do this is to use an RTMP server such as SRS, which also supports a WebRTC player. It works like this:

OBS/FFmpeg ---RTMP---> SRS ---WebRTC--> Browser(Chrome)

Note: This solution requires only small changes on your side, and it also lets you deliver HLS for devices that don't support WebRTC, such as mobile browsers.

You can follow the wiki to set up the demonstration; I'm sure you can finish it within a minute, because it's very easy and has no other dependencies.

First, run SRS server by docker:

docker run --rm -it -p 1935:1935 -p 1985:1985 -p 8080:8080 \
    --env CANDIDATE="192.168.1.10" -p 8000:8000/udp \
    ossrs/srs:5 ./objs/srs -c conf/rtmp2rtc.conf

Note: Please set CANDIDATE to an IP that your browser can access, because WebRTC signaling and media use separate transport channels.
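As a convenience, you can derive CANDIDATE from the host's first address instead of hard-coding it; this is a sketch assuming Linux and `hostname -I` (any IP your browser can reach works):

```shell
# Pick the host's first IP address as the WebRTC candidate
# (assumes Linux; on other systems substitute the reachable IP by hand).
CANDIDATE=$(hostname -I | awk '{print $1}')
docker run --rm -it -p 1935:1935 -p 1985:1985 -p 8080:8080 \
    --env CANDIDATE="$CANDIDATE" -p 8000:8000/udp \
    ossrs/srs:5 ./objs/srs -c conf/rtmp2rtc.conf
```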

Then, use OBS or another encoder to publish an RTMP stream:

docker run --rm -it ossrs/srs:encoder \
  ffmpeg -stream_loop -1 -re -i doc/source.flv \
    -c copy -f flv rtmp://192.168.1.10/live/livestream

Note: Please replace the IP to the correct one.

Please open the H5 player to play the WebRTC stream.

Please note that this is not the only solution for low-latency live streaming; I will describe others below.

Low Latency Live Streaming

For low-latency live streaming, besides WebRTC, there are other solutions: HTTP-FLV and HTTP-TS. These protocols are very simple and the latency is about 1~3s.

Note: Neither WebRTC nor HTTP-TS is supported by most CDNs, but some, such as TencentCloud, do support them, and there will be more in the future.

Note: There are some issues with doing live streaming over WebRTC; please see this post.

First, run SRS server by docker:

docker run --rm -it -p 1935:1935 -p 1985:1985 -p 8080:8080 \
    ossrs/srs:5 ./objs/srs -c conf/srs.conf

Then, use OBS or another encoder to publish an RTMP stream:

docker run --rm -it ossrs/srs:encoder \
  ffmpeg -stream_loop -1 -re -i doc/source.flv \
    -c copy -f flv rtmp://192.168.1.10/live/livestream

Note: Please replace the IP to the correct one.

Please open the H5 player to play the HTTP-FLV stream.

Note: We only demonstrate HTTP-FLV; you can use HTTP-TS for better compatibility.
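You can also sanity-check the HTTP-FLV stream outside the browser with ffplay; the URL layout below follows SRS's defaults and may differ in your configuration:

```shell
# Play the HTTP-FLV stream with ffplay; the path (/live/livestream.flv on
# port 8080) is SRS's default URL layout, an assumption for this sketch.
ffplay http://192.168.1.10:8080/live/livestream.flv
```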

There are also some new features for live streaming. For example, HEVC is now available in both Safari and Chrome 107+, so you can do HEVC low-latency live streaming like this:

OBS/FFmpeg ---SRT---> SRS ---HTTP-TS/HLS--> Safari/Chrome/VLC/ffplay

Note: You can find the HEVC low-latency demo here.
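The publish side of that flow can be sketched with ffmpeg over SRT; the port (10080) and streamid syntax below follow SRS's documented defaults and are assumptions to adapt for your server:

```shell
# Publish HEVC over SRT to SRS; the streamid format (#!::r=...,m=publish)
# and port 10080 are SRS defaults and may need adjusting elsewhere.
ffmpeg -re -i input.mp4 -c:v libx265 -c:a aac -f mpegts \
  'srt://192.168.1.10:10080?streamid=#!::r=live/livestream,m=publish'
```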

I think WebRTC is not the same thing as live streaming, and live streaming will never die, so even RTMP will be used for a long time.

The new protocols for live streaming are not only WebRTC, but:

  • SRT or RIST: used to publish a live stream to a live streaming server or platform.
  • WebRTC: used to publish a live stream from an H5 web page.
  • HLS/HTTP-TS/WebRTC: used to play the live stream.

It works like this:

                                  +-------------+
OBS/FFmpeg/Encoder ---RTMP/SRT--> + Live Server +----RTMP/SRT--> Tool
H5/Chrome/Safari  ------WebRTC--> + or Platform +----HLS ---> Viewer
                                  +-------------+----HTTP-TS/WebRTC-->Viewer

For some scenarios, for example if you want to transcode the live stream, add a logo, or recompose it with OBS or other tools, RTMP/SRT is the best protocol.

To play the live stream on almost all devices and CDNs, HLS is the best solution, and you can use a smaller GOP to get about 5s of latency, see this post.
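To make the GOP smaller at the encoder, a hypothetical publish command looks like this (the frame rate, GOP size, and URL are illustrative values, not recommendations):

```shell
# Publish with a 1-second GOP (25 fps, keyframe every 25 frames) so the
# server can cut shorter HLS segments; all numbers are illustrative.
ffmpeg -re -i input.mp4 -c:v libx264 -g 25 -keyint_min 25 \
  -c:a aac -f flv rtmp://192.168.1.10/live/livestream
```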

If you want more real-time live streaming, you can use HTTP-TS or HTTP-FLV; the latency is about 1~3s, see this post.

WebRTC is also available for playing the live stream, using an SFU such as SRS to convert RTMP to WebRTC, for example for cloud gaming; please see this post.

Winlin
  • 1,136
  • 6
  • 25