
Based on the responses to a previous question, I've created a category on UIImageView for extracting pixel data. This works fine in the simulator, but not when deployed to the device. I should say not always -- the odd thing is that it does fetch the correct pixel colour if point.x == point.y; otherwise, it gives me pixel data for a pixel on the other side of that line, as if mirrored. (So a tap on a pixel in the lower-right corner of the image gives me the pixel data for a corresponding pixel in the upper-left, but tapping on a pixel in the lower-left corner returns the correct pixel colour). The touch coordinates (CGPoint) are correct.

What am I doing wrong?

Here's my code:

@interface UIImageView (PixelColor)
- (UIColor*)getRGBPixelColorAtPoint:(CGPoint)point;
@end

@implementation UIImageView (PixelColor)

- (UIColor*)getRGBPixelColorAtPoint:(CGPoint)point
{
    UIColor* color = nil;

    CGImageRef cgImage = [self.image CGImage];
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    NSUInteger x = (NSUInteger)floor(point.x);
    NSUInteger y = height - (NSUInteger)floor(point.y);

    if ((x < width) && (y < height))
    {
        CGDataProviderRef provider = CGImageGetDataProvider(cgImage);
        CFDataRef bitmapData = CGDataProviderCopyData(provider);
        const UInt8* data = CFDataGetBytePtr(bitmapData);
        size_t offset = ((width * y) + x) * 4;
        UInt8 red = data[offset];
        UInt8 blue = data[offset+1];
        UInt8 green = data[offset+2];
        UInt8 alpha = data[offset+3];
        CFRelease(bitmapData);
        color = [UIColor colorWithRed:red/255.0f green:green/255.0f blue:blue/255.0f alpha:alpha/255.0f];
    }

    return color;
}

@end
Shaggy Frog
  • Can you check the value of self.image.imageOrientation? It's possible that the image you're using is in UIImageOrientationLeftMirrored, which would be the reflection you're seeing, though I don't know why it would be that way only on the device... – David Maymudes Dec 16 '09 at 04:47
  • It's UIImageOrientationUp on both simulator and device. I'm not sure that "left mirrored" is the correct interpretation of what's happening, because the reflection is (seemingly) taking place along the diagonal y=x, and UIImageOrientationLeftMirrored is a simple rotation of the entire image "90 deg CCW" according to the SDK. – Shaggy Frog Dec 16 '09 at 06:41
  • More info: if I swap how x and y are being calculated, the behaviour is reversed -- it works on the device, but not on the simulator. (Although the exact positioning on the device seems to be off by a few degrees CCW, but it could be the imprecision of my finger versus using the mouse) – Shaggy Frog Dec 16 '09 at 20:57

3 Answers


I think your R B G ordering is wrong. You have:

UInt8 red =   data[offset];     
UInt8 blue =  data[offset+1];
UInt8 green = data[offset+2];

But don't you really mean R G B, like this:

UInt8 red =   data[offset];     
UInt8 green = data[offset+1];
UInt8 blue =  data[offset+2];

But even with that fixed, there's still a problem: it turns out Apple byte-swaps the R and B values on the device, but not in the simulator.

I had a similar simulator/device issue with a PNG's pixel buffer returned by CFDataGetBytePtr.

This resolved the issue for me:

#if TARGET_IPHONE_SIMULATOR
        UInt8 red =   data[offset];
        UInt8 green = data[offset + 1];
        UInt8 blue =  data[offset + 2];
#else
        //on device
        UInt8 blue =  data[offset];       //notice red and blue are swapped
        UInt8 green = data[offset + 1];
        UInt8 red =   data[offset + 2];
#endif

Not sure if this will fix your issue, but your misbehaving code looks close to what mine looked like before I fixed it.

One last thing: I believe the simulator will let you access your pixel buffer data[] even after CFRelease(bitmapData) is called. On the device this is not the case in my experience. Your code shouldn't be affected, but in case this helps someone else I thought I'd mention it.
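For instance, if you did need the values after releasing the data, you could copy them out first. A minimal sketch of the idea (reusing the data, offset, and bitmapData variables from the code above):

// The pointer from CFDataGetBytePtr is only valid while bitmapData is alive,
// so copy the bytes of interest before releasing it.
UInt8 pixel[4];
memcpy(pixel, data + offset, sizeof(pixel));
CFRelease(bitmapData);
// pixel[0..3] can now be read safely on both simulator and device.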

Monte Hurd
  • Took me some time to finally get back to this, but your code was bang-on. The combination of mixing up G and B, and not knowing about the byte ordering change Xcode does behind the scenes, was a fatal combination. – Shaggy Frog Jul 28 '10 at 23:05

You could try the following alternative approach:

  • create a CGBitmapContext
  • draw the image into the context
  • call CGBitmapContextGetData on the context to get the underlying data
  • work out your offset into the raw data (based on how you created the bitmap context)
  • extract the value

This approach works for me on the simulator and device.
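A minimal sketch of that approach (the method name is hypothetical; drawing into a 1x1 RGBA context whose byte order you choose yourself also sidesteps the simulator/device byte-swapping issue):

- (UIColor *)pixelColorByDrawingAtPoint:(CGPoint)point
{
    CGImageRef cgImage = [self.image CGImage];
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    NSUInteger x = (NSUInteger)floor(point.x);
    NSUInteger y = (NSUInteger)floor(point.y);
    if (x >= width || y >= height)
        return nil;

    // A 1x1 bitmap context backed by a 4-byte buffer in a known RGBA order.
    UInt8 pixel[4] = {0, 0, 0, 0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
        kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    // Offset the image so the requested pixel (top-left UIKit coordinates)
    // lands on the context's single pixel; CG's origin is bottom-left.
    CGContextDrawImage(context,
        CGRectMake(-(CGFloat)x, -(CGFloat)(height - 1 - y), width, height),
        cgImage);
    CGContextRelease(context);

    // Note: with premultiplied alpha the colour components are scaled by alpha.
    return [UIColor colorWithRed:pixel[0] / 255.0f
                           green:pixel[1] / 255.0f
                            blue:pixel[2] / 255.0f
                           alpha:pixel[3] / 255.0f];
}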

Andrew Ebling

It looks like in the code posted in the original question, instead of:

NSUInteger x = (NSUInteger)floor(point.x);
NSUInteger y = height - (NSUInteger)floor(point.y);

It should be:

NSUInteger x = (NSUInteger)floor(point.x);
NSUInteger y = (NSUInteger)floor(point.y);
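
For completeness, here's a sketch of the original method with both this fix and the channel/byte-order fixes from the accepted answer applied:

- (UIColor*)getRGBPixelColorAtPoint:(CGPoint)point
{
    UIColor* color = nil;

    CGImageRef cgImage = [self.image CGImage];
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    NSUInteger x = (NSUInteger)floor(point.x);
    NSUInteger y = (NSUInteger)floor(point.y);  // no vertical flip

    if ((x < width) && (y < height))
    {
        CGDataProviderRef provider = CGImageGetDataProvider(cgImage);
        CFDataRef bitmapData = CGDataProviderCopyData(provider);
        const UInt8* data = CFDataGetBytePtr(bitmapData);
        size_t offset = ((width * y) + x) * 4;
#if TARGET_IPHONE_SIMULATOR
        UInt8 red =   data[offset];
        UInt8 green = data[offset + 1];
        UInt8 blue =  data[offset + 2];
#else
        UInt8 blue =  data[offset];    // R and B are swapped on the device
        UInt8 green = data[offset + 1];
        UInt8 red =   data[offset + 2];
#endif
        UInt8 alpha = data[offset + 3];
        CFRelease(bitmapData);
        color = [UIColor colorWithRed:red/255.0f green:green/255.0f blue:blue/255.0f alpha:alpha/255.0f];
    }

    return color;
}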
Gu1234