
I'm very new to ffmpeg. Consider the following case:

I have several ONVIF IP cameras connected to a network that also contains an IIS server. I'd like to allow clients to stream from any of the IP cameras on the network, but the traffic must go through the IIS server.

So basically each IP camera will send a single stream to the IIS server, and the IIS server will redistribute it to the many clients who request it. My question is: how do I set up the IIS server for this scenario? I'd also appreciate an example ffmpeg command line that reads from an RTSP IP camera and sends the stream to the IIS server, which will then redistribute it to clients.

Buzz

1 Answer


You can use HTTP live streaming for this scenario, either HLS or DASH. HTTP streaming adds some latency, so you'll need to do a bit of research on how to tweak the encoding parameters for low latency.

The basic idea is that you need to segment the incoming stream and make those segments and playlist/manifest available via your existing web server infrastructure.

Example for FFmpeg and HLS:

ffmpeg -i rtsp://input_stream.sdp -c:v libx264 -r 25 -g 25 -c:a libfdk_aac -hls_time 1 -hls_list_size 4 -hls_wrap 8 /path/to/webroot/live/playlist.m3u8

On the client you will then use the URL http://domain.com/live/playlist.m3u8. HLS is not supported natively on all devices, so get a web player like JWPlayer or clappr. The client needs 3 segments to start playback.
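A minimal embed using clappr might look like the sketch below. The CDN path, player version and stream URL are placeholders; check the clappr documentation for the exact script URL to use:

```html
<!-- Hypothetical minimal clappr embed; adjust the script URL and source. -->
<div id="player"></div>
<script src="clappr.min.js"></script>
<script>
  var player = new Clappr.Player({
    source: "http://domain.com/live/playlist.m3u8", // HLS playlist served by IIS
    parentId: "#player",
    autoPlay: true
  });
</script>
```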

FFmpeg HLS

For DASH the idea is similar but you also need to use MP4Box.
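A rough two-step sketch of the MP4Box route, assuming GPAC is installed (file names and paths are placeholders, and a real live setup would use MP4Box's live dashing mode rather than a one-shot segmentation):

```shell
# 1. Transcode the RTSP feed into a fragmented MP4 (placeholder names).
ffmpeg -i rtsp://input_stream.sdp -c:v libx264 -r 25 -g 25 -c:a aac \
       -movflags empty_moov+default_base_moof /path/to/work/live.mp4

# 2. Segment it and generate the MPD manifest with MP4Box (GPAC).
MP4Box -dash 1000 -rap -profile live \
       -out /path/to/webroot/live/manifest.mpd /path/to/work/live.mp4
```

Clients would then request http://domain.com/live/manifest.mpd with a DASH-capable player.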

aergistal
  • Thank you. I'll try that. Anyway, for IIS, does any setting need to be done? Can the client just use HTML5 to play the live stream? – Buzz Apr 23 '15 at 09:12
  • Like I said in the answer, HLS doesn't work natively across all devices. The linked players offer Flash fallback on desktop, for example. One is paid, the other is free. Concerning IIS, if you use cloud-based players you need to add a `crossdomain.xml` and set up the `CORS` headers for cross-domain/cross-origin requests. – aergistal Apr 23 '15 at 09:16
  • Anyway, what is the "-g" option? Also, I tried that but I got an error: Unknown encoder 'libfdk_aac'. I downloaded ffmpeg builds from http://ffmpeg.zeranoe.com/builds/ — it says "Latest Zeranoe FFmpeg Build Version: git-cbe2700 (2015-04-22)", so I think it is the newest version. – Buzz Apr 23 '15 at 09:22
  • You need to compile FFmpeg with `libfdk_aac` support; it's not supplied with the Zeranoe builds. `-g` makes the encoder put a keyframe every second (25 frames / 25 fps), so each segment will start on a keyframe. – aergistal Apr 23 '15 at 09:26
  • Hi @aergistal, thank you so much for your help. I'd like to ask one more question before take decision. I read on http://stackoverflow.com/questions/15687434/what-is-the-difference-between-hls-and-mpeg-dash. What do you think, HLS or DASH perform better? Do you have any good link resource I can start with DASH? – Buzz Apr 24 '15 at 02:04
  • I think I'm starting to understand. So if we set -r to 25 and -g to 25, it means the output will be 25 fps with a segment every 25 frames, and the client needs a minimum of 3 segments to start. Is that correct? Therefore, the client will also be at least 3 seconds behind realtime? – Buzz Apr 24 '15 at 02:10
  • Once more :) Where can I download complete ffmpeg builds with all encoders? – Buzz Apr 24 '15 at 02:15
  • If the player buffers 3 segments then yes you'll have 3s + download time. You can compile FFmpeg: https://trac.ffmpeg.org/wiki/CompilationGuide. – aergistal Apr 24 '15 at 08:01
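To expand on the IIS configuration mentioned in the comments: the minimum is usually registering the HLS MIME types so IIS will serve the playlist and segments, plus the CORS header if the player is hosted on another domain. A `web.config` sketch for the webroot (the wildcard origin and MIME values are examples; tighten them for your setup, and Flash-based fallback players additionally need a `crossdomain.xml`):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <!-- Register HLS MIME types so IIS serves .m3u8 playlists and .ts segments -->
    <staticContent>
      <mimeMap fileExtension=".m3u8" mimeType="application/vnd.apple.mpegurl" />
      <mimeMap fileExtension=".ts" mimeType="video/mp2t" />
    </staticContent>
    <!-- Allow cross-origin requests from a player hosted on another domain -->
    <httpProtocol>
      <customHeaders>
        <add name="Access-Control-Allow-Origin" value="*" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>
```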