5

I have an NSImage. I would like to read the NSColor for a pixel at some x and y. Xcode seems to think that there is a colorAtX:y: method on NSImage, but calling it crashes with an error saying that no such method exists on NSImage. I have seen some examples where you create an NSBitmapImageRep and call the same method on that, but I have not been able to successfully convert my NSImage to an NSBitmapImageRep: the pixels on the NSBitmapImageRep come out different for some reason.

There must be a simple way to do this. It cannot be this complicated.

mtmurdock
  • In what way are the "pixels on the NSBitmapImageRep" different? I have done it that way before and gotten good results. – user1118321 Mar 15 '12 at 20:13
  • See Rob Keniger's answer below. The anchor point (origin) of NSImage is top-left, and NSBitmapImageRep is bottom-left. When you convert from one to the other, the image coordinates are flipped vertically. – mtmurdock Mar 19 '12 at 19:25
  • possible duplicate of [Get pixels and colours from NSImage](http://stackoverflow.com/questions/1994082/get-pixels-and-colours-from-nsimage) – smokris Jul 21 '14 at 03:25

2 Answers

11

Without seeing your code it's difficult to know what's going wrong.

You can create an NSBitmapImageRep from the image with the initWithData: initializer, passing in the image's TIFFRepresentation.

You can then read the pixel value with colorAtX:y:, which is a method of NSBitmapImageRep, not NSImage:

// Create a bitmap rep from the image's TIFF data
NSBitmapImageRep* imageRep = [[NSBitmapImageRep alloc] initWithData:[yourImage TIFFRepresentation]];
NSSize imageSize = [yourImage size];
// colorAtX:y: counts from the top left, NSImage coordinates from the bottom left, so flip y
CGFloat y = imageSize.height - 100.0;
NSColor* color = [imageRep colorAtX:100.0 y:y];
[imageRep release];

Note that you must adjust the y value because the colorAtX:y: method uses a coordinate system whose origin is at the top left of the image, whereas the NSImage coordinate system has its origin at the bottom left.

Alternatively, if the pixel is visible on-screen then you can use the NSReadPixel() function to get the color of a pixel in the current coordinate system.
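
As a rough illustration of that alternative (not from the original answer): someView below is a hypothetical NSView that is currently visible on screen, and the point is given in the coordinate system in effect while the view has focus:

[someView lockFocus]; // focus the view so NSReadPixel() reads from its drawing context
NSColor* onScreenColor = NSReadPixel(NSMakePoint(100.0, 100.0)); // point in the focused coordinate system
[someView unlockFocus];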

Rob Keniger
  • Ok so I am trying it this way, but something seems off. Is the coordinate system of an NSBitmapImageRep different from that of an NSImage? Everything seems to be flipped vertically. It would seem that NSImage is being drawn from the bottom left (like views are by default) and the NSBitmapImageRep is drawing from the top left (thus flipping the image). I'm not actually drawing to the screen, so I can't see. – mtmurdock Mar 16 '12 at 15:46
  • It appears you are correct, the coordinates are flipped. I've added code to account for that to my answer. – Rob Keniger Mar 16 '12 at 22:29
  • Thanks for your help Rob. I ran into another roadblock while editing the image, and found that I had to create a new NSImage instance from the altered NSBitmapImageRep. So my code ended up being along the lines of: create an NSBitmapImageRep from the NSImage, read pixels, change pixels, create a new NSImage from the altered NSBitmapImageRep, assign the new NSImage to the view (a rough sketch of this workflow follows these comments). Thanks for the help! – mtmurdock Mar 19 '12 at 19:21
  • Note that you might want to use imageRep.pixelsWide and imageRep.pixelsHigh, as these are resolution independent. When using the size as in this example, it doesn't necessarily correspond to the pixels you are trying to read. – Nathan S. Apr 11 '15 at 05:32
  • Thanks @NathanS. -- I just got burned by that :) – chinabuffet Sep 20 '15 at 10:13
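
Putting the comments above together, here is a rough sketch of that read/modify/rebuild workflow, addressing the bitmap by pixelsWide/pixelsHigh as Nathan S. suggests. The names yourImage and yourImageView are placeholders, and the transparency edit is just an arbitrary example, not anything from the original answer:

// Sketch only: yourImage is an NSImage, yourImageView an NSImageView (both hypothetical names)
NSBitmapImageRep* rep = [[NSBitmapImageRep alloc] initWithData:[yourImage TIFFRepresentation]];

// Address actual pixels, not points, so this also works for Retina / non-72-dpi images
NSInteger pixelsWide = [rep pixelsWide];
NSInteger pixelsHigh = [rep pixelsHigh];

for (NSInteger y = 0; y < pixelsHigh; y++) {
    for (NSInteger x = 0; x < pixelsWide; x++) {
        NSColor* pixel = [rep colorAtX:x y:y]; // (0,0) is the top-left pixel of the bitmap
        if ([pixel alphaComponent] < 0.5) {
            // Arbitrary example edit: force mostly-transparent pixels to fully transparent
            [rep setColor:[NSColor clearColor] atX:x y:y];
        }
    }
}

// Wrap the altered bitmap in a fresh NSImage and hand it to the view
NSImage* editedImage = [[NSImage alloc] initWithSize:NSMakeSize(pixelsWide, pixelsHigh)];
[editedImage addRepresentation:rep];
[yourImageView setImage:editedImage];

[rep release];
[editedImage release];

One caveat: initWithSize: is specified in points, so if the source image is Retina-scaled you may prefer to pass the original [yourImage size] instead of the pixel dimensions.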
1

The colorAtX:y: method of NSBitmapImageRep does not seem to use the device color space, which can give color values slightly different from what you actually see on screen. Use this code to get the color in the current device color space:

[yourImage lockFocus]; // yourImage is just your NSImage variable
NSColor *pixelColor = NSReadPixel(NSMakePoint(1, 1)); // Or another point
[yourImage unlockFocus];
Ely