
I am using FFmpeg to create an MP4 file on my server. I am also trying to use qt-faststart to move the moov atom to the front so the file will stream. I have searched all over the internet with no luck. Is it possible to put my video/audio into an MP4 file and play it while FFmpeg is still dumping video and audio data into it? The point is that I am trying to stream from a camera, and Android is horrid... I know both iOS and Android support MP4, so I was trying to figure out a way to turn my RTSP stream into MP4.

Main point of the story: I want to continuously feed my camera feed into the MP4 container and still be able to play back the file so my clients can watch.

Any help is appreciated, thank you.

Dnaso
  • You are mixing up "Fast start" (QuickTime terminology for Progressive download) and RTSP (a protocol that enables a client to command a streaming server). Progressive download will not help you because the moov atom can only be written when the encoding is finished. If you want to stream from a camera, you need a streaming server – Duvrai Feb 26 '14 at 22:00
  • @Duvrai I completely understand the difference. What I needed to know is whether there is a way to read from an MP4 before it is completely done being written, while FFmpeg is still encoding the RTSP stream – Dnaso Feb 26 '14 at 22:21
  • My fault, I didn't read your question well. Maybe you could clarify the question a bit and use some paragraphs. As I understand it now, you have a linux box that receives an rtsp stream. You want to output it to 1) a fast-start file and 2) a live stream, correct? – Duvrai Feb 26 '14 at 22:33
  • Yes, the main problem I am having is this: with ffserver I can't find a suitable output format that plays on Android or iPhone. I don't know why this is such a black-box thing in 2014. I was wondering if there is a way to continually feed the MP4 container AND progressively download it to the client, so that A: there is a buffer even though it's live, and B: it plays on Android and iPhone – Dnaso Feb 26 '14 at 22:46

1 Answer


You can publish a live stream, and when the stream has ended, publish the progressive-download file.

In FFmpeg, to stream live and save a duplicate of that stream into a file at the same time, without encoding twice, you can use the tee pseudo-muxer. Since tee does not represent a single output format, you typically have to map the streams and pick the encoders yourself. Something like this:

ffmpeg \
  -i <input-stream> \
  -map 0 -c:v libx264 -c:a aac \
  -f tee "[movflags=+faststart]output.mp4|http://<ffserver>/<feed_name>"

Update: You might try to stream a fragmented MP4 directly.

Update 2:

  • Create a fragmented MP4 (note that -frag_duration is given in microseconds, so the value below produces roughly one-second fragments):

    ffmpeg -i input -frag_duration 1000000 stream.mp4
    
  • Normally, a web server will want to know the size of a file it serves, so to serve the file without knowing its size you need to configure your web server to use chunked transfer encoding (a rough sketch of this follows the list).
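
To make that last step concrete, here is a minimal sketch, not part of the original answer, of serving a still-growing fragmented MP4 with chunked transfer encoding so a client can begin playback before FFmpeg has finished writing. The file name stream.mp4, the port, and the idle timeout are assumed values for illustration; in practice you would more likely enable chunked responses in your existing web server (nginx, Apache) or use a dedicated streaming server.

    # Sketch only: serve stream.mp4 while ffmpeg is still appending to it.
    # No Content-Length is sent; the body uses chunked transfer encoding,
    # and the handler polls the file for new data until it has been idle
    # for IDLE_TIMEOUT seconds (assumed values, adjust to your setup).
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    STREAM_FILE = "stream.mp4"   # file ffmpeg is writing to (assumption)
    IDLE_TIMEOUT = 10.0          # give up after this many idle seconds
    CHUNK = 64 * 1024

    class GrowingFileHandler(BaseHTTPRequestHandler):
        protocol_version = "HTTP/1.1"  # chunked encoding requires HTTP/1.1

        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "video/mp4")
            self.send_header("Transfer-Encoding", "chunked")
            self.send_header("Connection", "close")
            self.end_headers()

            idle = 0.0
            with open(STREAM_FILE, "rb") as f:
                while idle < IDLE_TIMEOUT:
                    data = f.read(CHUNK)
                    if data:
                        idle = 0.0
                        # Chunk framing: hex length, CRLF, payload, CRLF.
                        self.wfile.write(("%X\r\n" % len(data)).encode("ascii"))
                        self.wfile.write(data)
                        self.wfile.write(b"\r\n")
                    else:
                        # Reached the current end of file; wait for ffmpeg
                        # to append more data, then try again.
                        time.sleep(0.5)
                        idle += 0.5
            # A zero-length chunk terminates the response.
            self.wfile.write(b"0\r\n\r\n")

    if __name__ == "__main__":
        HTTPServer(("", 8080), GrowingFileHandler).serve_forever()

This also illustrates the polling problem raised in the last comment below: the server, not the player, has to decide when the file has really stopped growing, and here it simply gives up after a fixed idle period.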

Duvrai
  • I know I can do that. What I want to do is figure out a way to do it in real time. For example, AXIS and D-Link cameras let you stream live from the camera via MP4. If they can do it, there has to be a way to do it. – Dnaso Feb 26 '14 at 23:40
  • @Dnaso Did you try encoding as fragmented MP4 and serving it over HTTP with chunked transfer encoding, as I proposed in update 2? – Duvrai Feb 27 '14 at 14:15
  • Can you point me to a link? I have not heard of it – Dnaso Feb 27 '14 at 21:07
  • Could you post a resource link? – Dnaso Mar 01 '14 at 21:03
  • I am going to check it out. If this works... pretty amazing, thank you – Dnaso Mar 01 '14 at 21:06
  • It works. Did you see how to keep polling for more info? I.e. it just stops because it thinks the video is over (it does work though, I can now play before encoding stops) – Dnaso Mar 03 '14 at 18:01