18

I can get individual frames from the iPhone's cameras just fine. What I need is a way to package them up with sound for streaming to the server. Sending the files once I have them isn't much of an issue; it's the generation of the files for streaming that I'm having problems with. I've been trying to get FFmpeg to work, without much luck.

Does anyone have ideas on how I can pull this off? I'd like either a known-working API or instructions for getting FFmpeg to compile properly in an iPhone app.

Sabby
iHorse

1 Answer

28

You could divide your recording into separate files of, say, 10 seconds each, then send them separately. If you use AVCaptureSession's beginConfiguration and commitConfiguration methods to batch your output changes, you shouldn't drop any frames between the files (a rough sketch follows the list below). This has many advantages over frame-by-frame upload:

  • The files can be used directly for HTTP Live Streaming without any server-side processing.
  • The gaps between data transfers allow the antennas to sleep in between if the connection is fast enough, saving battery life.
  • Conversely, if the connection is slow and uploading lags behind recording, managing the delayed upload of a set of files is much easier than managing a stream of bytes.
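A minimal Swift sketch of that idea, assuming AVCaptureMovieFileOutput and a session that already has camera/microphone permission; the SegmentedRecorder class, the 10-second timer, and the file naming are illustrative rather than anything from the answer, and (as the comments note) swapping file outputs this way may still leave a short gap while each file finalizes:

```swift
import AVFoundation

// Illustrative segmented recorder; all names here are hypothetical.
final class SegmentedRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    private let session = AVCaptureSession()
    private var activeOutput = AVCaptureMovieFileOutput()
    private var segmentIndex = 0

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video),
              let mic = AVCaptureDevice.default(for: .audio) else { return }

        // Batch the initial setup so the session is configured atomically.
        session.beginConfiguration()
        let cameraInput = try AVCaptureDeviceInput(device: camera)
        let micInput = try AVCaptureDeviceInput(device: mic)
        if session.canAddInput(cameraInput) { session.addInput(cameraInput) }
        if session.canAddInput(micInput) { session.addInput(micInput) }
        if session.canAddOutput(activeOutput) { session.addOutput(activeOutput) }
        session.commitConfiguration()

        session.startRunning()
        activeOutput.startRecording(to: nextSegmentURL(), recordingDelegate: self)

        // Roll over to a new file roughly every 10 seconds.
        Timer.scheduledTimer(withTimeInterval: 10, repeats: true) { [weak self] _ in
            self?.rollSegment()
        }
    }

    private func rollSegment() {
        let finishedOutput = activeOutput
        let newOutput = AVCaptureMovieFileOutput()

        finishedOutput.stopRecording()   // delegate callback fires once the file is finalized

        // Batch the output swap, as the answer suggests, so the change applies atomically.
        session.beginConfiguration()
        session.removeOutput(finishedOutput)
        if session.canAddOutput(newOutput) { session.addOutput(newOutput) }
        session.commitConfiguration()

        activeOutput = newOutput
        activeOutput.startRecording(to: nextSegmentURL(), recordingDelegate: self)
    }

    private func nextSegmentURL() -> URL {
        segmentIndex += 1
        return FileManager.default.temporaryDirectory
            .appendingPathComponent("segment-\(segmentIndex).mov")
    }

    // Each finished segment lands here.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // e.g. hand outputFileURL to your uploader or HLS packager
    }
}
```

The delegate callback is the natural place to hand each finished file to an uploader or an HTTP Live Streaming packager.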
mohsenr
  • Since iHorse found a solution to his issue, enjoy your added rep; it proved to be good info for me :) – Aaron Aug 16 '10 at 21:02
  • @iHorse. Would you mind sharing your solution please. My email address is jordan@whackfaqs[dotcom]. Would very much appreciate it. I need to do the same thing. – Jordan Apr 18 '11 at 01:02
  • @Tegeril - Would you mind sharing your solution with me? I actually need to do the same thing. Jordan@whackfaqs[add dotcom] – Jordan Apr 18 '11 at 01:04
  • I didn't ever end up implementing anything here, I was just curious how it would work and thus put a bounty on the question. iHorse, if he is keeping track, might be able to help you more. – Aaron Apr 21 '11 at 20:29
  • @Mo This is tantalizing but seems there's more to the story. Are you suggesting using two (or more) different capture output objects? I've tried this and I can't do it without a pause. I can't start a new recording on the same or a different movie output object until it sends me a successful finished recording message, which takes ~0.1 seconds after I send -stop. – Bored Astronaut Jun 20 '11 at 15:43
  • 7
    @Bored Astronaut - This is how my app used to do it. It can be done without pauses. Use AVAssetWriter. Create two of them. Start one, switch to the next and create a new one on a background queue (a sketch of this hand-off follows after these comments). – Steve McFarlin Jun 24 '11 at 05:22
  • @BoredAstronaut Jordan - I have made a working solution that doesn't drop very many frames: http://stackoverflow.com/a/13187931/805882 It uses the approach SteveMcFarlin described. – Chris Ballinger Nov 02 '12 at 01:24
  • Could anyone please help me with my live streaming problem? Thanks :) http://stackoverflow.com/questions/20894810/how-do-i-stream-video-from-iphone-acting-as-a-server – johk95 Jan 03 '14 at 12:01
  • "The files can be directly used for HTTP live streaming without any server side processing" This statement does not seem to be correct. The MOV files are not compatible with .ts files that HLS required. – EmilyJ Mar 20 '14 at 19:27