
I don't know how to get started with this.

What I am trying to do is get a video + audio stream from the front-end and host the live stream as an MP4 that's accessible in the browser.

I was able to find information on WebRTC, socket.io, and RTMP, but I'm not really sure which tool to use / what's best suited for something like this.

Also, a follow-up question: my front-end is an iOS app. What format would I send the live stream to the server in?

Sam KC
  • It's my understanding that Apple requires usage of HLS for any live streaming in apps. Therefore, it's probably easier if you just encode the appropriate HLS segments from the source. Then, you don't need anything special server-side at all. – Brad Aug 18 '21 at 04:01
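As a concrete sketch of Brad's suggestion, FFmpeg can segment a source directly into HLS. The input and output paths below are placeholders; tune segment length and codec settings to your needs:

```shell
# Hypothetical example: transcode an input (a file, or a capture device)
# into HLS segments plus a rolling playlist. All paths are placeholders.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset veryfast -c:a aac \
  -f hls \
  -hls_time 4 \
  -hls_list_size 5 \
  -hls_flags delete_segments \
  /var/www/live/stream.m3u8
```

Serve the output directory with any plain HTTP server; the player just requests `stream.m3u8`, so nothing special is needed server-side.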

1 Answer


It depends on which live streaming protocol you want the player to consume; as @Brad said, HLS is the most common protocol for players.

Note: Besides HLS, an iOS native app can use fijkplayer or FFmpeg to play almost any live streaming format, such as HLS, RTMP, or HTTP-FLV, even MKV. However, the most straightforward solution is HLS: a `<video>` tag is enough to play MP4 or HLS. MSE is also an option, using flv.js/hls.js to play live streams on iOS/Android/PC; this post covers these protocols.

The stream flow is like this:

FFmpeg/OBS ---RTMP--->--+ 
                        +--> Media Server---> HLS/HTTP-FLV---> Player
Browser ----WebRTC--->--+

The protocol used to push to the media server (or to receive in your node server) depends on your encoder, either RTMP or H5 (WebRTC):

  • For RTMP, you could use FFmpeg or OBS to push stream to your media server.
  • If you want to push the stream from an H5 page, the only way is WebRTC.
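For the RTMP branch, a minimal FFmpeg push might look like this (the server hostname and stream key are placeholders for your media server's ingest URL):

```shell
# Hypothetical example: push a local file as a live RTMP stream.
# Replace rtmp://your.server.com/live/stream with your server's URL.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset veryfast -tune zerolatency \
  -c:a aac \
  -f flv rtmp://your.server.com/live/stream
```

`-re` reads the input at its native frame rate, so a file behaves like a live source; OBS achieves the same result with its "Stream" settings pointed at the RTMP URL.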

The media server converts the protocol from publisher to player, and different protocols are in use for live streaming right now (as of 2022.01); please read more from this post.
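As one possible setup (an assumption, not the only choice), SRS is an open-source media server that accepts RTMP/WebRTC ingest and serves HLS and HTTP-FLV to players; its published Docker image can be run like this, though the exact ports and paths depend on its configuration:

```shell
# Hypothetical example: run SRS as the "Media Server" box in the diagram.
# 1935 = RTMP ingest, 8080 = HTTP for players (HLS / HTTP-FLV).
docker run --rm -p 1935:1935 -p 8080:8080 ossrs/srs:4
# Push:  rtmp://localhost/live/stream
# Play:  http://localhost:8080/live/stream.m3u8  (HLS)
#   or:  http://localhost:8080/live/stream.flv   (HTTP-FLV)
```

Any media server with the same ingest/egress protocols (e.g. nginx-rtmp for RTMP-to-HLS) fills the same role.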

Winlin