3

I am implementing live broadcasting in my Android app. I am using WebRTC for real-time video chat, and now I would like to broadcast the live chat to many people using MPEG-DASH. For that, the video stream can be sent to the server over RTMP and then broadcast using MPEG-DASH.

So I would like to know how to capture the media streams of both the local and remote users and then send them over RTMP. I have a working prototype for sending camera-captured video to the server over RTMP, but I don't know how to send the same media stream that WebRTC is using to the server. Possible solutions:

  1. Record/capture the screen of the live chat and then send it to the server over RTMP.
  2. Make the server a peer in WebRTC, manipulate the stream, and broadcast it via MPEG-DASH.

I would like to do it on the client side. Is there any other way to do this? Thanks.

Uchit Kumar

1 Answer

0

You should use a WebRTC SFU to forward packets to the apps and to convert WebRTC to RTMP. It works like this:

android app --WebRTC--> Server -+--WebRTC--> android app
                                |
                                +--RTMP--> live streaming platform
                                +--HLS/DASH--> player

Because WebRTC media is always encrypted with DTLS-SRTP, you should convert the stream on an SFU server, rather than hacking the stream on the client.
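For example, an open-source SFU such as SRS can do this WebRTC-to-RTMP conversion. The fragment below is a sketch based on SRS 4.0's `rtc2rtmp` sample config; directive names may differ in other versions, so verify them against your SRS release:

```
listen              1935;          # RTMP ingest/output port
max_connections     1000;

rtc_server {
    enabled         on;
    listen          8000;          # WebRTC (UDP) port
    candidate       $CANDIDATE;    # server's public IP for ICE
}

vhost __defaultVhost__ {
    rtc {
        enabled     on;
        rtc_to_rtmp on;            # bridge WebRTC publish to an RTMP stream
    }
}
```

With a config like this, the Android app publishes over WebRTC as usual, and the server exposes the same stream over RTMP for repackaging to HLS/DASH.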

Please read more in this post.
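For the HLS/DASH leg of the diagram, the server's RTMP output can be repackaged into MPEG-DASH with ffmpeg's `dash` muxer. The sketch below uses a synthetic `testsrc` input so it runs anywhere; in the real pipeline you would replace the input with the SFU's RTMP URL (the URL shown in the comment is a placeholder):

```shell
# In production the input would be the SFU's RTMP output, e.g.:
#   ffmpeg -i rtmp://your-server/live/stream -c copy -f dash ...
# Here a synthetic source stands in for it so the command is self-contained:
mkdir -p ./dash_out
ffmpeg -y -hide_banner -loglevel error \
  -f lavfi -i testsrc=duration=2:size=320x240:rate=15 \
  -c:v libx264 -preset ultrafast \
  -f dash -seg_duration 1 -window_size 5 \
  ./dash_out/stream.mpd
# ./dash_out now contains the DASH manifest (stream.mpd) plus init/media segments,
# which any DASH player can fetch over plain HTTP.
ls ./dash_out
```

`-seg_duration` controls segment length in seconds and `-window_size` how many segments stay in the live manifest; tune both for your latency/robustness trade-off.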

Winlin