I know how to get frames from the iOS SDK ([How to capture video frames from the camera as images using AV Foundation](http://developer.apple.com/library/ios/#qa/qa1702/_index.html)). Each frame comes as a pixel buffer, and I can convert it to JPEG.
The way I want to stream the video is like this:
On one iOS device (A):

- Get the pixel buffer (or a JPEG) from the capture callback
  -(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
  (see the sketch after this list)
- Encode it to H.264 with existing technology (ffmpeg/libx264)
- Encapsulate the video in an MPEG-TS stream
- Run an HTTP server and wait for requests

On the other iOS device (B):

- Make an HTTP request to A (plain HTTP, simply instead of RTP/RTSP) and play the stream
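
For the capture step, this is roughly what I have so far for pulling the raw pixels out of the sample buffer (a minimal sketch; it assumes the video data output is configured for kCVPixelFormatType_32BGRA, and the encodeFrame: call at the end is just a hypothetical placeholder for whatever encoder I end up using):

```
// In the AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Grab the pixel buffer backing this video frame
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void   *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t  bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t  width       = CVPixelBufferGetWidth(imageBuffer);
    size_t  height      = CVPixelBufferGetHeight(imageBuffer);

    // Hypothetical hand-off to the encoder (ffmpeg/x264 or an iOS API)
    // [self.encoder encodeFrame:baseAddress width:width height:height bytesPerRow:bytesPerRow];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}
```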
So my question is: do I need to use ffmpeg to get an H.264 stream, or can I get one from an iOS API? If I have to use ffmpeg (libx264) to encode to H.264, how do I do that? Is there any sample code or a guideline?
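
To show where I am with libx264: from reading x264.h my guess is that the encode loop looks roughly like the sketch below (untested; the resolution and frame rate are placeholders, and since x264 wants I420/YUV420 planar input, the BGRA frames from AVFoundation would first need a colorspace conversion, e.g. with libswscale). Is this the right direction, or is there a better way on iOS?

```
#include "x264.h"

static x264_t         *encoder;
static x264_picture_t  pic_in, pic_out;

void encoder_setup(int width, int height)
{
    x264_param_t param;
    // "zerolatency" tune so encoded frames can be sent out immediately
    x264_param_default_preset(&param, "veryfast", "zerolatency");
    param.i_width   = width;
    param.i_height  = height;
    param.i_fps_num = 30;   // placeholder frame rate
    param.i_fps_den = 1;
    x264_param_apply_profile(&param, "baseline");

    encoder = x264_encoder_open(&param);
    x264_picture_alloc(&pic_in, X264_CSP_I420, width, height);
}

// Call once per captured frame, after converting BGRA -> I420 into pic_in.img.plane[0..2]
void encode_one_frame(int64_t pts)
{
    pic_in.i_pts = pts;

    x264_nal_t *nals;
    int         num_nals;
    int frame_size = x264_encoder_encode(encoder, &nals, &num_nals, &pic_in, &pic_out);
    if (frame_size > 0) {
        // nals[0].p_payload points at frame_size bytes of H.264 NAL units
        // (x264 lays the NAL payloads out contiguously); this is what would
        // then be wrapped in the TS stream and served over HTTP.
    }
}

void encoder_teardown(void)
{
    x264_picture_clean(&pic_in);
    x264_encoder_close(encoder);
}
```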
I've read the post "What's the best way of live streaming iphone camera to a media server?". It's a pretty good discussion, but I want to know the details.