
I have downloaded the ffmpeg library and compiled it for armv7. I added the ffmpeg lib files to my project successfully, and I am able to get the iPhone camera's live stream using AVFoundation.
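For reference, my capture setup looks roughly like this (a minimal sketch; the preset, pixel format, and queue name are just what I happen to use):

#import <AVFoundation/AVFoundation.h>

- (void)setupCaptureSession
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPreset640x480;

    // Default (back) camera as input
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) [session addInput:input];

    // Video data output that delivers sample buffers to the delegate below
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                              @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };
    dispatch_queue_t queue = dispatch_queue_create("videoQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    [session addOutput:output];

    [session startRunning];
}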

Now the problem is: how do I feed the iPhone camera's output into ffmpeg as input for decoding? Here is my code:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Raw data block backing this sample buffer
    CMBlockBufferRef bufferData = CMSampleBufferGetDataBuffer(sampleBuffer);

    size_t lengthAtOffset;
    size_t totalLength;
    char *data;

    // Get a pointer to the contiguous bytes at offset 0
    if (CMBlockBufferGetDataPointer(bufferData, 0, &lengthAtOffset, &totalLength, &data) != noErr) {
        NSLog(@"error!");
    }
}

Could you please suggest which function of the ffmpeg lib is used for decoding, and how I can pass a CMBlockBufferRef as input to it?
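From the ffmpeg samples I have looked at, I imagine it would be something like the sketch below, but I am not sure this is right (the H.264 codec ID is my assumption, and the helper names setupDecoder/decodeBytes are mine, not ffmpeg's):

#include "libavcodec/avcodec.h"

static AVCodecContext *codecCtx = NULL;

// One-time decoder setup -- H.264 is assumed here
static void setupDecoder(void)
{
    avcodec_register_all();
    AVCodec *codec = avcodec_find_decoder(CODEC_ID_H264);
    codecCtx = avcodec_alloc_context3(codec);
    avcodec_open2(codecCtx, codec, NULL);
}

// Feed the bytes pulled out of the CMBlockBufferRef into the decoder
static void decodeBytes(uint8_t *bytes, int size)
{
    AVPacket packet;
    av_init_packet(&packet);
    packet.data = bytes;
    packet.size = size;

    AVFrame *frame = avcodec_alloc_frame();
    int gotFrame = 0;
    avcodec_decode_video2(codecCtx, frame, &gotFrame, &packet);
    if (gotFrame) {
        // frame->data / frame->linesize now hold the decoded picture
    }
    av_free(frame);
}

I would then call decodeBytes((uint8_t *)data, (int)totalLength) from the callback above, if that is the right approach at all.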

Thanks

Inder Kumar Rathore

1 Answer


You can use OpenTok for live video streaming:

https://github.com/opentok/opentok-ios-sdk
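Roughly, connecting and publishing the camera looks like this (a minimal sketch only; the key, session ID, and token are placeholders from the OpenTok dashboard, and exact class names and signatures may differ between SDK versions):

#import <Opentok/Opentok.h>

// Placeholders -- fill these in from your OpenTok account
static NSString *const kApiKey    = @"API_KEY";
static NSString *const kSessionId = @"SESSION_ID";
static NSString *const kToken     = @"TOKEN";

- (void)startStreaming
{
    OTSession *session = [[OTSession alloc] initWithApiKey:kApiKey
                                                 sessionId:kSessionId
                                                  delegate:self];
    [session connectWithToken:kToken error:nil];
}

// OTSessionDelegate -- once connected, publish the camera feed
- (void)sessionDidConnect:(OTSession *)session
{
    OTPublisher *publisher = [[OTPublisher alloc] initWithDelegate:self];
    [session publish:publisher error:nil];
}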

Vikas Ojha
  • Sir, see this question, this is Pravi (Nikhil's friend): http://stackoverflow.com/questions/11986313/http-live-streaming-for-iphone-and-why-we-use-m3u8-file – Prabhjot Singh Gogana Aug 17 '12 at 07:53