
I know how to get a frame from the iOS SDK ([How to capture video frames from the camera as images using AV Foundation](http://developer.apple.com/library/ios/#qa/qa1702/_index.html)). That gives me pixel data, which I can convert to JPEG.

The way I want to transfer the video is like this:

On iOS device A:

  1. Get the pixel buffer (or a JPEG) from the capture callback:

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

  2. Encode it to H.264 using existing technology (ffmpeg).

  3. Encapsulate the video in a TS stream.

  4. Run an HTTP server and wait for requests.
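The capture step above could be sketched roughly like this in Objective-C. This is only a sketch: it assumes the capture session is already configured for BGRA output, and the `self.encoder` object and its `encodeFrame:` method are hypothetical placeholders for whatever encoder stage comes next.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// AVCaptureVideoDataOutputSampleBufferDelegate callback, invoked on the
// video data output queue for every captured frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Grab the raw pixel buffer for this frame.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Convert the pixel buffer to a CGImage via Core Image.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGRect extent = CGRectMake(0, 0,
                               CVPixelBufferGetWidth(pixelBuffer),
                               CVPixelBufferGetHeight(pixelBuffer));
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:extent];

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    // 0.7 is an arbitrary JPEG quality for this sketch.
    NSData *jpeg = UIImageJPEGRepresentation(image, 0.7);

    // Hand the frame off to the next stage (hypothetical helper).
    [self.encoder encodeFrame:jpeg];
}
```

For H.264 encoding you would normally skip the JPEG step entirely and feed the raw pixel buffer to the encoder; the JPEG path is only useful for a quick proof of concept.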

The other iOS device B:

  1. Make an HTTP request to A (using plain HTTP rather than RTP/RTSP).
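On device B, the client side could be sketched as below. The URL, the `StreamClient` class, and the `tsDemuxer` property are all placeholders for this example; the point is that the delegate callback fires repeatedly as chunks of the transport stream arrive over the socket.

```objc
#import <Foundation/Foundation.h>

// StreamClient.m -- minimal sketch of pulling a TS stream over plain HTTP.
@implementation StreamClient

- (void)start {
    // Placeholder address for device A's HTTP server.
    NSURL *url = [NSURL URLWithString:@"http://192.168.0.10:8080/stream.ts"];
    NSURLSession *session = [NSURLSession
        sessionWithConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration]
                        delegate:self   // implements NSURLSessionDataDelegate
                   delegateQueue:nil];
    [[session dataTaskWithURL:url] resume];
}

// Called repeatedly as chunks of the TS stream arrive.
- (void)URLSession:(NSURLSession *)session
          dataTask:(NSURLSessionDataTask *)dataTask
    didReceiveData:(NSData *)data {
    [self.tsDemuxer appendData:data]; // hypothetical demuxer/player stage
}

@end
```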

So my question is: do I need to use ffmpeg to get an H.264 stream, or can I get it from an iOS API? If I use ffmpeg to encode to H.264 (libx264), how do I do that? Is there any sample code or guideline?
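For the libx264 part, the basic setup and encode step looks roughly like the fragment below (plain C, callable from an Objective-C file). It assumes you have cross-compiled libx264 for iOS and already converted each camera frame to I420 (planar YUV 4:2:0); the dimensions and `frame_number` variable are placeholders.

```c
#include <x264.h>

// One-time setup: low-latency baseline settings suit live streaming.
x264_param_t param;
x264_param_default_preset(&param, "veryfast", "zerolatency");
param.i_width   = 640;            // example dimensions
param.i_height  = 480;
param.i_csp     = X264_CSP_I420;
param.i_fps_num = 30;
param.i_fps_den = 1;
param.b_repeat_headers = 1;       // emit SPS/PPS with each keyframe
param.b_annexb  = 1;              // Annex-B start codes, as a TS muxer expects
x264_param_apply_profile(&param, "baseline");

x264_t *encoder = x264_encoder_open(&param);

x264_picture_t pic_in, pic_out;
x264_picture_alloc(&pic_in, X264_CSP_I420, param.i_width, param.i_height);

// Per frame: copy the Y/U/V planes from the camera buffer into
// pic_in.img.plane[0..2], set a timestamp, then encode.
pic_in.i_pts = frame_number;
x264_nal_t *nals;
int n_nals;
int size = x264_encoder_encode(encoder, &nals, &n_nals, &pic_in, &pic_out);
if (size > 0) {
    // nals[0].p_payload points at `size` contiguous bytes of
    // Annex-B H.264 -- hand them to the TS muxer / HTTP server.
}
```

Note the App Store licensing caveat raised below applies to this route, since libx264 is GPL-licensed.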

I've read the post "What's the best way of live streaming iphone camera to a media server?" It's a pretty good discussion, but I want to know the details.

2 Answers


The license for ffmpeg is incompatible with iOS applications distributed through the App Store.

If you want to transfer realtime video with any kind of usable frame rate, you won't want to use HTTP or TCP.


Although this doesn't directly answer your question about which video format to use, I would suggest looking into third-party frameworks like TokBox or QuickBlox. There is a fantastic tutorial using Parse and OpenTok here:

http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/
