10

According to this question, What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?, it is possible to get compressed data from the iPhone's camera, but as I've been reading in the AVFoundation reference, you only get uncompressed data.

So the questions are:

1) How can I get compressed frames and audio from the iPhone's camera?

2) Is encoding uncompressed frames with FFmpeg's API fast enough for real-time streaming?

Any help will be really appreciated.

Thanks.

Alexandre OS
  • I ended up getting uncompressed data (frames and audio) from AVFoundation and encoding+streaming using FFmpeg's API. It works pretty well on the iPhone 4, getting up to 30 FPS at a resolution of 192x240. At higher resolutions it drops too many frames. – Alexandre OS Aug 22 '12 at 13:46
  • @AlexandreOS How did you do this? Please share it; it would be helpful for us. Thanks – Rahul Juyal Sep 13 '12 at 05:36
  • 1
    @Ron [Get uncompressed data from AVFoundation](http://developer.apple.com/library/ios/#DOCUMENTATION/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2), then convert each [CMSampleBuffer to FFmpeg's AVPicture](http://stackoverflow.com/questions/4499160/how-to-convert-cmsamplebuffer-uiimage-into-ffmpegs-avpicture). You can encode the AVPicture instance using FFmpeg. Take a look at FFmpeg's [ffmpeg.c](http://ffmpeg.org/doxygen/trunk/ffmpeg_8c-source.html) file as example of how to achieve this encoding part. Hope this helps you. – Alexandre OS Sep 14 '12 at 13:28

2 Answers

9

You most likely already know....

1) How can I get compressed frames and audio from the iPhone's camera?

You cannot do this. The AVFoundation API has prevented this from every angle. I even tried named pipes and some other sneaky Unix foo. No such luck. You have no choice but to write it to a file. In your linked post, a user suggests setting up a callback to deliver encoded frames. As far as I am aware, this is not possible for H.264 streams. The capture delegate will deliver images encoded in a specific pixel format. It is the movie writers and AVAssetWriter that do the encoding.

2) Is encoding uncompressed frames with FFmpeg's API fast enough for real-time streaming?

Yes, it is. However, you will have to use libx264, which gets you into GPL territory. That is not exactly compatible with the App Store.

I would suggest using AVFoundation and AVAssetWriter for efficiency reasons.

Steve McFarlin
  • 4
    Actually, 1 is partially wrong. I have written a library that delivers H.264 data in real time as it is being encoded, without using a private API. – Steve McFarlin May 13 '11 at 04:30
  • 1
    Could you share this library with us? It would be very good to have an alternative way to get these H.264 streams without using libx264. That's why I ended up using MPEG codecs from FFmpeg. – Alexandre OS Aug 22 '12 at 13:53
  • @AlexandreOS Unfortunately my library is only commercially licensed. I may in the future release it under a dual license. – Steve McFarlin Aug 23 '12 at 21:54
  • @SteveMcFarlin Is the library you wrote using AVAssetWriterInput and pixel buffers? Please correct me if I'm wrong. – Splendid Aug 24 '12 at 13:16
  • @SteveMcFarlin any recent progress on providing some insight to the rest of us on the first point you mentioned? – abestic9 Jan 29 '14 at 02:45
  • @AndrewBestic Just parse the mdat as the video is being recorded. It is in avcc format, e.g. generally the first 4 bytes are the size of the NALU. You will have to write a temp file and parse the stsd atom (I think) to get the SPS and PPS. What I did is similar to this: http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html. The only difference is that I wrote most of my code in C, and my parser and handling of the SPS/PPS data are a bit more robust. – Steve McFarlin Jan 30 '14 at 21:50
  • That sounds good; SPS is something I came across today. It looks like writing a temp file is the only way around this. – abestic9 Jan 31 '14 at 11:45
4

I agree with Steve. I'd add that if you try this with Apple's API, you're going to have to do some seriously nasty hacking. AVAssetWriter by default spends a second before spilling its buffer to the file, and I haven't found a way to change that with settings. The way around it seems to be to force small file writes and file closes by using multiple AVAssetWriters, but that introduces lots of overhead. It's not pretty.

Definitely file a new feature request with Apple (if you're an iOS developer). The more of us who do, the more likely they'll add some sort of writer that can write to a buffer and/or to a stream.

One addition I'd make to what Steve said on the x264 GPL issue: I believe you can get a commercial license for x264, which avoids the GPL requirements but of course costs money. That means you could still use it, get pretty good results, and not have to open up your own app's source. It's not as good as an augmented Apple API using their hardware codecs, but it's not bad.

monadical