
I'm trying to find a way to calculate the average color of the screen using Objective-C.

So far I use this code to get a screen shot, which works great:

CGImageRef image1 = CGDisplayCreateImage(kCGDirectMainDisplay);

NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCGImage:image1];
// Create an NSImage and add the bitmap rep to it...
NSImage *image = [[NSImage alloc] init];
[image addRepresentation:bitmapRep];

Now my problem is to calculate the average RGB color of this image.

I've found one solution, but the R, G, and B color components always come out equal:

NSInteger i = 0;
NSInteger components[3] = {0,0,0};
unsigned char *data = [bitmapRep bitmapData];

NSInteger pixels = ([bitmapRep size].width *[bitmapRep size].height);

do {
    components[0] += *data++;
    components[1] += *data++;
    components[2] += *data++;
} while (++i < pixels);

int red = (CGFloat)components[0] / pixels;
int green = (CGFloat)components[1] / pixels;
int blue = (CGFloat)components[2] / pixels;
Jakob Halskov
    Are they coming out zero? Or what are the three channel values coming out to be? – I82Much Dec 26 '11 at 22:22
  • Your algorithm assumes that a) each channel within a pixel i) fits into an `unsigned char` and ii) is stored as an integer, and b) that the channels are stored in RGB order, without an alpha component. Are you sure that all these things are true? Do you know the values of `bitsPerPixel`, and `samplesPerPixel`? – jscs Dec 26 '11 at 22:37
  • It always gives me red, green and blue values which are equal, e.g. red = 87, green = 87, blue = 87. Josh, no, I'm not sure. Basically I need the best way to get the average RGB color of the screen. – Jakob Halskov Dec 27 '11 at 10:33
  • 1
    See [this answer](http://stackoverflow.com/questions/12147779/how-do-i-release-a-cgimageref-in-ios/12148136#12148136) for an optimized approach. It's about four times faster than iterating over pixel data. – Nikolai Ruhe Aug 28 '12 at 11:57

2 Answers

3

A short analysis of bitmapRep shows that each pixel has 32 bits (4 bytes), where the first byte is unused padding; in other words, the format is XRGB and the X byte is ignored. (There are no padding bytes at the end of a pixel row.)

Another remark: to count the number of pixels you use the method -(NSSize)size.
You should never do this! size has nothing to do with pixels; it only says how big the image should be rendered (on screen or on a printer). To count (or otherwise work with) the pixels you should use -(NSInteger)pixelsWide and -(NSInteger)pixelsHigh. The (incorrect) use of -size works if and only if the resolution of the imageRep is 72 dots per inch.
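Putting both corrections together, the inner loop could look like the following (a minimal sketch in plain C, assuming the packed 32-bit XRGB layout with no row padding described above; `averageXRGB` is a hypothetical helper name, not from the original post):

```c
/* Average the R, G and B channels of a packed XRGB buffer:
   4 bytes per pixel, where byte 0 is unused padding. */
static void averageXRGB(const unsigned char *data,
                        long pixelsWide, long pixelsHigh,
                        long outRGB[3])
{
    unsigned long long sums[3] = {0, 0, 0};
    long pixels = pixelsWide * pixelsHigh;
    for (long i = 0; i < pixels; i++) {
        const unsigned char *p = data + 4 * i;
        sums[0] += p[1]; /* red   (p[0] is the unused X byte) */
        sums[1] += p[2]; /* green */
        sums[2] += p[3]; /* blue  */
    }
    for (int c = 0; c < 3; c++)
        outRGB[c] = (long)(sums[c] / (unsigned long long)pixels);
}
```

Called with [bitmapRep bitmapData], [bitmapRep pixelsWide] and [bitmapRep pixelsHigh], this skips the padding byte on every pixel instead of drifting across channel boundaries.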

Finally: there is a similar question at Average Color of Mac Screen

Heinrich Giesen
  • How should I change my code to fix this (regarding the 4 bytes, not 3)? Your link to the similar question leads me back to my own (this) question. – Jakob Halskov Dec 28 '11 at 11:56
  • 2
    The most simple (and most ugly) way to do it, is to add `data++;` as first statement in the do-loop. That skips the first (unused) byte. In case the last byte of a pixel is not used, add `data++;` as last statement in the do-loop. A more careful iteration over all pixels is of course preferable. – Heinrich Giesen Dec 29 '11 at 10:17
2

Your data is probably aligned as 4 bytes per pixel (not 3 bytes, as you assume). That would (statistically) explain the near-equal values you get.
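The arithmetic behind that: with a 3-byte stride over 4-byte pixels, each accumulator visits the X, R, G and B positions in rotation, so after a few pixels all three sums contain the same mix of channels. A small C simulation of the question's loop (hypothetical helper name, not from the original post) demonstrates the effect:

```c
/* Simulate the question's 3-byte-stride loop over a buffer that is
   actually 4 bytes per pixel (XRGB), and return the three integer
   "channel averages" it produces. Reads 3*pixels bytes, which stays
   within the 4*pixels-byte buffer. */
static void misreadAverages(const unsigned char *data, long pixels,
                            long avg[3])
{
    long components[3] = {0, 0, 0};
    const unsigned char *p = data;
    for (long i = 0; i < pixels; i++) {
        /* Byte index mod 4 rotates, so each accumulator picks up
           X, R, G and B bytes in turn instead of a single channel. */
        components[0] += *p++;
        components[1] += *p++;
        components[2] += *p++;
    }
    for (int c = 0; c < 3; c++)
        avg[c] = components[c] / pixels;
}
```

For a screen that is uniformly R=200, G=100, B=50 with a zero padding byte, all three "averages" come out as (0 + 200 + 100 + 50) / 4 = 87, exactly the kind of equal values reported in the comments above.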

Kris Van Bael