
I have a UIImageView that I draw to using [UIColor orangeColor]. I also have a function that is supposed to detect the color of the pixel tapped on.

R: 1.000000 G: 0.501961 B: 0.000000

That's the RGB value I receive when attempting to detect the pixel color for [UIColor orangeColor].

It should be:

R: 1.000000 G: 0.5 B: 0.000000

Here's my function:

- (UIColor *)colorAtPixel:(CGPoint)point {
    // Cancel if point is outside image coordinates
    if (!CGRectContainsPoint(CGRectMake(0.0f, 0.0f, _overlay_imageView.frame.size.width, _overlay_imageView.frame.size.height), point)) {
        return nil;
    }


    // Create a 1x1 pixel byte array and bitmap context to draw the pixel into.
    // Reference: http://stackoverflow.com/questions/1042830/retrieving-a-pixel-alpha-value-for-a-uiimage
    NSInteger pointX = trunc(point.x);
    NSInteger pointY = trunc(point.y);
    CGImageRef cgImage = _overlay_imageView.image.CGImage;
    NSUInteger width = CGImageGetWidth(cgImage);
    NSUInteger height = CGImageGetHeight(cgImage);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    int bytesPerPixel = 4;
    int bytesPerRow = bytesPerPixel * 1;
    NSUInteger bitsPerComponent = 8;
    unsigned char pixelData[4] = { 0, 0, 0, 0 };
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 1,
                                                 1,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextSetBlendMode(context, kCGBlendModeCopy);

    // Draw the pixel we are interested in onto the bitmap context
    CGContextTranslateCTM(context, -pointX, -pointY);
    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, (CGFloat)width, (CGFloat)height), cgImage);
    CGContextRelease(context);

    // Convert color values [0..255] to floats [0.0..1.0]
    CGFloat red   = (CGFloat)pixelData[0] / 255.0f;
    CGFloat green = (CGFloat)pixelData[1] / 255.0f;
    CGFloat blue  = (CGFloat)pixelData[2] / 255.0f;
    CGFloat alpha = (CGFloat)pixelData[3] / 255.0f;

    return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
}

Any ideas?

I should mention that my UIImageView has a clear background and it's on top of a black canvas. Could that be the issue?

digit
  • I'm not sure what type of help you are asking for here. Could you be more specific regarding how this is a problem? – Brian Anderson Nov 17 '17 at 22:57
  • Yes, I need to compare color values to UIColor colornameColor to see if I match a correct pixel. – digit Nov 18 '17 at 00:07

1 Answer


There's nothing wrong with your function. This is a result of floating point math. Half of 255 (the max value of an unsigned byte) is either 127/255.0 or 128/255.0, depending on how you round. Neither of those is 0.5; they are 0.498039215686275 and 0.501960784313725 respectively.

EDIT: I should add that the colors in the CGImage are stored as bytes, not floats. So when you create your orange with float components in UIColor, the 0.5 gets quantized to R: 255, G: 128, B: 0, A: 255. When you read this back as floats you get R: 1.0, G: 0.501961, B: 0.0, A: 1.0.

Josh Homann
  • So would it be better to use 256.0 as the constant denominator as many of the primary and secondary colors are built from factors of two? – Brian Anderson Nov 18 '17 at 00:21
  • No; 255 is the correct max value of a UInt8. If you really care about the rounding, just specify all of your colors in bytes and there will be no conversion or loss of precision. You only need floating point numbers if you are using wide color with Display P3. – Josh Homann Nov 18 '17 at 00:37
  • Josh Homann: can you possibly update my function to work with Bytes so I can see? – digit Nov 18 '17 at 17:42