
I am working on my first Mac OS X Cocoa app for 10.5+. I have a CVImageBufferRef (captured using QTKit), and I need to transfer this image over a TCP socket to the client app. The client app needs RGB values. Here is what I am currently doing (my current solution works as needed but uses a lot of CPU):

CVImageBufferRef --> NSBitmapImageRep --> NSData --> transmit the NSData over the TCP socket to the client app, where I have the following code to get the RGB values:

UInt8 r, g, b;
int x, y;
int width = 320;
int height = 240;

NSData *data; // read from the TCP socket
NSBitmapImageRep *bitmap = [[NSBitmapImageRep alloc] initWithData:data];

for (y = 0; y < height; y++) {
    for (x = 0; x < width; x++) {

        NSColor *color = [bitmap colorAtX:x y:y];
        // ^^ this line is actually the culprit and uses a lot of CPU

        r = [color redComponent] * 100;
        g = [color greenComponent] * 100;
        b = [color blueComponent] * 100;
        NSLog(@"r:%d g:%d b:%d", r, g, b);
    }
}

[bitmap release];

If a CVImageBufferRef can be converted directly to an array of RGB values, that would be perfect; otherwise I need an efficient way to convert an NSBitmapImageRep to RGB values.

Abduliam Rehmanius
  • Something's wrong with your sample code. You don't use `bitmap`, and what is `self.imageData`? – JWWalker Oct 20 '11 at 01:48
  • Sorry, corrected the code. Actually, in my real code another thread stores the image in an ivar and a second thread accesses the image data; thread safety is maintained using an NSLock. – Abduliam Rehmanius Oct 20 '11 at 11:15

2 Answers

You can just get hold of the bitmap data using -bitmapData and take the pixel values you need directly from it. That should be more than 100x faster in many cases.

(The NSLog call should go, too.)
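
For illustration, here is a minimal sketch of what that could look like on the client side, reading straight from the buffer instead of calling colorAtX:y: per pixel. It assumes a meshed (non-planar) bitmap with RGB as the first three samples per pixel; bitmap is the NSBitmapImageRep from the question:

unsigned char *pixels = [bitmap bitmapData];
NSInteger bytesPerRow = [bitmap bytesPerRow];
NSInteger samplesPerPixel = [bitmap samplesPerPixel];
NSInteger x, y;

for (y = 0; y < [bitmap pixelsHigh]; y++) {
    unsigned char *row = pixels + y * bytesPerRow;      // rows may include padding
    for (x = 0; x < [bitmap pixelsWide]; x++) {
        unsigned char *px = row + x * samplesPerPixel;
        UInt8 r = px[0], g = px[1], b = px[2];          // assumed RGB(A) sample order
        // use r, g, b here instead of -colorAtX:y:
    }
}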

justin

If going straight from a CVImageBufferRef, you can use it as a CVPixelBufferRef, since CVPixelBuffer is derived from CVImageBuffer. Use CVPixelBufferLockBaseAddress() and then CVPixelBufferGetBaseAddress() to get a pointer to the first pixel. There are also many other CVPixelBufferGet* functions (http://developer.apple.com/library/mac/#documentation/QuartzCore/Reference/CVPixelBufferRef/Reference/reference.html) to query the width, height, etc., so you know the size of the data to be transmitted. And of course call CVPixelBufferUnlockBaseAddress() once you are done with the data.
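
As an example, a minimal sketch of that lock/read/unlock pattern might look like this, where imageBuffer stands in for the CVImageBufferRef delivered by QTKit (the variable name is illustrative):

#import <CoreVideo/CoreVideo.h>

CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)imageBuffer;

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
size_t width = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

// base points at the first pixel; each row is bytesPerRow bytes apart, which
// may include padding, so copy height * bytesPerRow bytes for transmission.
// Check CVPixelBufferGetPixelFormatType() before interpreting the bytes as RGB.

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);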

If using the NSBitmapImageRep, I agree with Justin: [bitmap bitmapData] will return a pointer to the first element. Even with no prior knowledge of the pixel layout, you can send the raw data, which will be:

unsigned char *data = [bitmap bitmapData];
unsigned long size = [bitmap pixelsWide] * [bitmap pixelsHigh] * [bitmap samplesPerPixel] * sizeof(unsigned char);
// note: this assumes no per-row padding; if [bitmap bytesPerRow] differs from
// pixelsWide * samplesPerPixel, use [bitmap bytesPerRow] * [bitmap pixelsHigh] instead

This data will then be successive RGB values, i.e. 1stRedComponent = *(data + 0), 1stGreenComponent = *(data + 1), 1stBlueComponent = *(data + 2) will give you the first pixel's RGB components, if you need pixel-specific access outside of the NSBitmapImageRep.
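
As a usage sketch, that raw buffer can then be wrapped in an NSData for the socket, using the data and size variables from the snippet above:

NSData *payload = [NSData dataWithBytes:data length:size];
// write payload to the TCP socket; the client indexes into its bytes as described above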

pontidm
  • Works perfectly now; CPU usage dropped from 85% to just 10% :) Actually I tried [bitmap bitmapData] yesterday but it didn't work out; after reading your excellent detailed answer it just worked... thanks, and keep it up ;-) – Abduliam Rehmanius Oct 20 '11 at 11:17