I am working on my first Mac OS X Cocoa app for 10.5+. I have a CVImageBufferRef (captured using QTKit) that I need to transfer over a TCP socket to a client app, and the client app needs RGB values. Here is what I am currently doing (it works as needed, but uses a lot of CPU):
CVImageBufferRef --> NSBitmapImageRep --> NSData --> transmit the NSData over the TCP socket to the client app (a trimmed-down sketch of the capture side appears after the client code below). On the client side I have the following code to extract the RGB values:
UInt8 r, g, b;
int width = 320;
int height = 240;
int x, y;
NSData *data; // read from the TCP socket
NSBitmapImageRep *bitmap = [[NSBitmapImageRep alloc] initWithData:data];
for (y = 0; y < height; y++) {
    for (x = 0; x < width; x++) {
        NSColor *color = [bitmap colorAtX:x y:y];
        // ^^ this line is the culprit and uses a lot of CPU
        r = [color redComponent] * 100;   // scale the 0.0-1.0 component to 0-100
        g = [color greenComponent] * 100;
        b = [color blueComponent] * 100;
        NSLog(@"r:%d g:%d b:%d", r, g, b);
    }
}
[bitmap release];
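For context, the capture side does roughly this (a simplified sketch: the intermediate CIImage step and the choice of JPEG stand in for whatever actually goes over the wire, and imageBuffer is the CVImageBufferRef delivered by the QTKit capture callback):

// sketch of the capture-side conversion (simplified)
CIImage *ciImage = [CIImage imageWithCVImageBuffer:imageBuffer];
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCIImage:ciImage];
NSData *frameData = [rep representationUsingType:NSJPEGFileType properties:nil];
// ... write frameData to the TCP socket ...
[rep release];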
If the CVImageBufferRef can be converted directly to an array of RGB values, that would be perfect; otherwise I need an efficient way to convert an NSBitmapImageRep to RGB values.
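Something like the following is what I have in mind, although I have not tried it yet. It assumes the CVImageBufferRef is actually a CVPixelBufferRef and that the capture output has been asked for a packed pixel format such as kCVPixelFormatType_32ARGB (both assumptions I would still need to verify):

// untested sketch: read raw pixels straight out of the CVPixelBuffer,
// assuming a 32-bit ARGB pixel format was requested from the capture output
CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)imageBuffer;
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
UInt8 *base = (UInt8 *)CVPixelBufferGetBaseAddress(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
size_t bufWidth = CVPixelBufferGetWidth(pixelBuffer);
size_t bufHeight = CVPixelBufferGetHeight(pixelBuffer);
size_t px, py;
for (py = 0; py < bufHeight; py++) {
    UInt8 *row = base + py * bytesPerRow;
    for (px = 0; px < bufWidth; px++) {
        UInt8 *pixel = row + px * 4; // ARGB layout: pixel[0]=A, pixel[1]=R, pixel[2]=G, pixel[3]=B
        UInt8 r = pixel[1];
        UInt8 g = pixel[2];
        UInt8 b = pixel[3];
        // ... append r, g, b to the buffer that gets sent over the socket ...
    }
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

On the NSBitmapImageRep side, I assume walking [bitmap bitmapData] directly (together with -bytesPerRow and -bitsPerPixel) would at least avoid creating an NSColor object per pixel, but getting at the raw buffer before it is ever wrapped in a bitmap seems cleaner.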