30

I am capturing video using the AVFoundation framework, following the Apple documentation: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html%23//apple_ref/doc/uid/TP40010188-CH5-SW2

So far I have done the following:

1. Created a videoCaptureDevice
2. Created an AVCaptureDeviceInput and set videoCaptureDevice as its device
3. Created an AVCaptureVideoDataOutput and implemented its delegate
4. Created an AVCaptureSession, set the AVCaptureDeviceInput as its input and the AVCaptureVideoDataOutput as its output

5. In the AVCaptureVideoDataOutput delegate method

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

I get the CMSampleBuffer, convert it into a UIImage, and test it by displaying it in a UIImageView using

[self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];

Everything worked fine up to this point.
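For reference, a minimal sketch of how steps 1–4 might be wired up (the queue label and the BGRA output setting are assumptions for illustration):

AVCaptureDevice *videoCaptureDevice =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *videoInput =
    [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// Ask for BGRA so each frame arrives as a single flat pixel plane.
videoOutput.videoSettings =
    [NSDictionary dictionaryWithObject:
         [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                 forKey:(id)kCVPixelBufferPixelFormatTypeKey];
// Deliver frames on a background queue (queue label is illustrative).
dispatch_queue_t queue = dispatch_queue_create("videoCaptureQueue", NULL);
[videoOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

AVCaptureSession *session = [[AVCaptureSession alloc] init];
if ([session canAddInput:videoInput])
    [session addInput:videoInput];
if ([session canAddOutput:videoOutput])
    [session addOutput:videoOutput];
[session startRunning];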

My problem is that I need to send the video frames through a UDP socket. Even though I knew it was a bad idea, I tried converting each UIImage to NSData and sending it over the UDP socket, but this introduced a significant delay in the video processing, mostly because of the UIImage-to-NSData conversion.

So please give me a solution to my problem:

1) Is there any way to convert a CMSampleBuffer or CVImageBuffer to NSData?
2) Should I use something like Audio Queue Services, but for video, to queue up UIImages, convert them to NSData, and send them?

If I am taking the wrong approach, please point me in the right direction.

Thanks in advance.

Vasu Ashok

2 Answers

39

Here is code to get at the buffer. This code assumes a flat image (e.g. BGRA).

NSData* imageToBuffer( CMSampleBufferRef source) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    // Copy the raw pixels out while the base address is locked.
    // Use bytesPerRow (which may include row padding), not width * 4.
    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // dataWithBytes:length: already returns an autoreleased object;
    // autoreleasing it again would cause an over-release.
    return data;
}

A more efficient approach would be to use an NSMutableData or a buffer pool.
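For example, a sketch of the NSMutableData variant; the static buffer here is illustrative and not thread-safe:

static NSMutableData *frameData = nil;

NSData* imageToReusedBuffer(CMSampleBufferRef source) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t length = CVPixelBufferGetBytesPerRow(imageBuffer) *
                    CVPixelBufferGetHeight(imageBuffer);
    if (frameData == nil || [frameData length] != length) {
        [frameData release];
        frameData = [[NSMutableData alloc] initWithLength:length];
    }
    // Overwrite the existing allocation instead of creating a new NSData
    // for every frame.
    [frameData replaceBytesInRange:NSMakeRange(0, length)
                         withBytes:CVPixelBufferGetBaseAddress(imageBuffer)
                            length:length];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return frameData;
}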

Sending a 480x360 image every second will require about a 4.1 Mbps connection, assuming 3 color channels (480 × 360 pixels × 3 bytes × 8 bits ≈ 4.1 Mbit per frame).

Steve McFarlin
  • Thanks for your response. CVImageBuffer to NSData is OK. I have one small doubt: on the receiver I couldn't get back the CVImageBuffer, because that needs the height and width separately, and the bytes per row too. – Vasu Ashok Jun 01 '11 at 07:29
  • You could send this in the data 'header' if you wanted: [width:16bits][height:16bits][sequence number:16bits][count:16bits][data fragment]. There are other problems with your setup. Even assuming an MTU of 1500, your images are going to be spread out over many UDP packets. You are going to lose packets given the bitrate you need. I suggest you use TCP. I can see no advantage to using UDP given you will not be able to push more than maybe 2 frames per second out of the phone over a LAN/WiFi (@480x360). And at this rate you need to care about lost video data. – Steve McFarlin Jun 01 '11 at 07:42
  • Alternatively you could send the video information ahead of time to the receiver, or in the first UDP packet. You will still need some way to identify image fragments. – Steve McFarlin Jun 01 '11 at 07:45
  • How do you convert back to an image buffer? – Paul Solt Sep 23 '11 at 16:41
  • This code is giving a data length of 3686400, but the image is just 3000. – sajwan Jan 28 '13 at 21:30
  • What do you mean the image is just 3000? Do you mean just 3000 pixels in total? What format is the image (RGBA, YUV, etc.)? – Steve McFarlin Jan 28 '13 at 22:48
  • I mean when I NSLog(@"%i",[data length]); it always logs 3686400. – sajwan Feb 01 '13 at 20:17
  • Using the above code to get NSData, how would you get back to a CVImageBuffer by using CGImageCreate? I can't see the link. Or is there another way? – noRema Mar 18 '13 at 09:57
  • I am not sure. I would just use a CVPixelBuffer and copy the contents of the NSData over to the CVPixelBuffer. – Steve McFarlin Mar 20 '13 at 20:00
  • @SteveMcFarlin How do you get the bytes from an audio buffer? – prabhu Mar 14 '19 at 12:32
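Several comments above ask about the reverse direction: getting from the NSData back to an image buffer on the receiver. A minimal sketch, assuming the width, height, bytes per row, and BGRA format were delivered out of band (for example in the header described above); the function name is illustrative:

CVPixelBufferRef pixelBufferFromData(NSData *data, size_t width,
                                     size_t height, size_t bytesPerRow) {
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreateWithBytes(
        kCFAllocatorDefault,
        width, height,
        kCVPixelFormatType_32BGRA,   // must match the sender's pixel format
        (void *)[data bytes],        // data must outlive the pixel buffer
        bytesPerRow,
        NULL, NULL,                  // no release callback in this sketch
        NULL,
        &pixelBuffer);
    return (status == kCVReturnSuccess) ? pixelBuffer : NULL;
}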
2

Use CMSampleBufferGetImageBuffer to get a CVImageBufferRef from the sample buffer, then get the bitmap data from it with CVPixelBufferGetBaseAddress. This avoids needlessly copying the image.
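A minimal sketch of that no-copy path inside the delegate callback; the sendBytes:length: call is a placeholder for whatever socket API you use:

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);

const uint8_t *pixels = CVPixelBufferGetBaseAddress(imageBuffer);
size_t length = CVPixelBufferGetBytesPerRow(imageBuffer) *
                CVPixelBufferGetHeight(imageBuffer);
// Send (or encode) the bytes in place while the buffer stays locked.
// [socket sendBytes:pixels length:length];   // hypothetical send call

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);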

Rhythmic Fistman
  • Thanks for your reply. I already used CVPixelBufferGetBaseAddress, but can I use that as bytes? And using that alone, will I be able to draw the image on the receiver side? – Vasu Ashok Jun 01 '11 at 04:57
  • Yes, those are bytes. There are CVPixelBufferGetHeight()*CVPixelBufferGetBytesPerRow() of them. And if you do it right, you'll be able to reconstruct the image at the other end. – Rhythmic Fistman Jun 01 '11 at 06:36
  • Does this work the same for planar format? Or does one need to use CVPixelBufferGetHeightOfPlane and CVPixelBufferGetBytesPerRowOfPlane instead? –  Apr 01 '14 at 00:28
  • You should use the planar versions. Saves you decoding the CVPlanarPixelBufferInfo_YCbCrBiPlanar struct. – Rhythmic Fistman Apr 01 '14 at 04:07
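For reference, a sketch of the planar variant mentioned above, assuming a bi-planar YCbCr buffer (e.g. kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange):

CVPixelBufferLockBaseAddress(imageBuffer, 0);

size_t planeCount = CVPixelBufferGetPlaneCount(imageBuffer);
for (size_t i = 0; i < planeCount; i++) {
    void  *plane    = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, i);
    size_t rowBytes = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, i);
    size_t rows     = CVPixelBufferGetHeightOfPlane(imageBuffer, i);
    // Plane 0 is Y, plane 1 is interleaved CbCr; each is rowBytes * rows bytes.
    // ... copy or send the plane's bytes here ...
}

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);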