I have downloaded the FFmpeg library, compiled it for armv7, and added the FFmpeg lib files to my project successfully. I am able to get the iPhone camera's live frames using AVFoundation.
Now the problem: how do I feed the iPhone camera output into FFmpeg as input for decoding? Here is my code:
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    {
        CMBlockBufferRef bufferData = CMSampleBufferGetDataBuffer(sampleBuffer);
        if (bufferData == NULL) return; // camera video frames may not carry a block buffer

        size_t lengthAtOffset = 0, totalLength = 0;
        char *data = NULL;
        if (CMBlockBufferGetDataPointer(bufferData, 0, &lengthAtOffset, &totalLength, &data) != noErr) {
            NSLog(@"error!");
        }
    }
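From the CMBlockBuffer documentation, I understand the buffer can be fragmented, so the data pointer may only cover part of it. My rough idea (untested) is to copy everything into one contiguous chunk first, using CMBlockBufferGetDataLength and CMBlockBufferCopyDataBytes:

    // Untested sketch: copy the (possibly fragmented) block buffer
    // into one contiguous chunk before passing it anywhere.
    size_t length = CMBlockBufferGetDataLength(bufferData);
    uint8_t *contiguous = malloc(length);
    if (CMBlockBufferCopyDataBytes(bufferData, 0, length, contiguous) != kCMBlockBufferNoErr) {
        NSLog(@"copy failed");
    }
    // ... hand `contiguous` / `length` to the decoder here ...
    free(contiguous);

Is that the right way to get the raw bytes out?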
Could someone suggest which FFmpeg function is used for decoding, and how I can pass the CMBlockBufferRef data to it as input?
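From reading avcodec.h, my guess at the decode side is the sketch below. I am assuming an H.264 elementary stream and an FFmpeg version from the avcodec_decode_video2 era; setupDecoder/decodeBytes are just my own names, and I have not gotten this to work:

    #include "libavcodec/avcodec.h"

    static AVCodecContext *codecCtx = NULL;
    static AVFrame *decodedFrame = NULL;

    // One-time setup: find and open the H.264 decoder.
    static void setupDecoder(void) {
        av_register_all();                                       // required on older FFmpeg builds
        AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264); // CODEC_ID_H264 on very old versions
        codecCtx = avcodec_alloc_context3(codec);
        avcodec_open2(codecCtx, codec, NULL);
        decodedFrame = avcodec_alloc_frame();                    // av_frame_alloc() on newer versions
    }

    // Wrap raw compressed bytes in an AVPacket and try to decode one frame.
    static void decodeBytes(uint8_t *bytes, int size) {
        AVPacket packet;
        av_init_packet(&packet);
        packet.data = bytes;
        packet.size = size;

        int gotFrame = 0;
        avcodec_decode_video2(codecCtx, decodedFrame, &gotFrame, &packet);
        if (gotFrame) {
            // decodedFrame->data / decodedFrame->linesize now hold the YUV planes
        }
    }

Does this look like the right direction?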
Thanks