
I'm trying to get the color of a pixel from a UIImage, using a scroll view to scroll around an image imported from the photos library or taken with the camera. Similar to this thread, here's the code I'm using to get the color at a given pixel:

var image: UIImage?
private var imageData: CFData?
private var imageByteData: UnsafePointer<UInt8>?

// Normalize the image's orientation first, then cache a pointer to its raw pixel bytes.
private func processImageData() {
    if let fixedOrientationImage = image!.fixOrientation() {
        imageData = fixedOrientationImage.cgImage?.dataProvider?.data
        imageByteData = CFDataGetBytePtr(imageData)
    }
}

// Returns the color at the given pixel coordinate.
// The byte offset assumes 4 bytes per pixel and a row stride of image width * 4.
func getColorFromPixel(_ pixel: CGPoint) -> UIColor? {

    let pixelByteLocation = ((Int(image!.size.width) * Int(pixel.y)) + Int(pixel.x)) * 4

    if let data = imageByteData {
        // fixOrientation() returns the image in BGRA format, not RGBA
        let b = CGFloat(data[pixelByteLocation]) / 255
        let g = CGFloat(data[pixelByteLocation + 1]) / 255
        let r = CGFloat(data[pixelByteLocation + 2]) / 255
        let a = CGFloat(data[pixelByteLocation + 3]) / 255

        return UIColor(red: r, green: g, blue: b, alpha: a)

    } else {
        return nil
    }
}
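
For context, here's roughly how the tapped point becomes the CGPoint I pass to getColorFromPixel(_:). This is a simplified sketch rather than my exact code: imageView and handleTap are placeholder names (imageView stands in for the scroll view's zooming content view), and the proportional mapping assumes the image view's bounds match the image's aspect ratio.

// Sketch only: placeholder names, not my exact project code.
@objc private func handleTap(_ gesture: UITapGestureRecognizer) {
    guard let image = image else { return }

    // Tap location in the image view's coordinate space
    // (the image view is the scroll view's zooming content view).
    let location = gesture.location(in: imageView)

    // Map from view points into the image's coordinate space
    // (image.size units, which is what getColorFromPixel expects).
    let scaleX = image.size.width / imageView.bounds.width
    let scaleY = image.size.height / imageView.bounds.height
    let pixel = CGPoint(x: location.x * scaleX, y: location.y * scaleY)

    if let color = getColorFromPixel(pixel) {
        print("Tapped color: \(color)")
    }
}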

I'm also fixing the image's orientation to `.up` before I set the `imageData`, so the CGImage's pixel data is laid out the way the image is displayed. Similar to this answer, here's the code I'm using to fix the image's orientation:

extension UIImage {

    // Redraw the image into a new context so the underlying CGImage
    // is stored "up", regardless of the original imageOrientation.
    func fixOrientation() -> UIImage? {

        UIGraphicsBeginImageContextWithOptions(size, false, scale)
        self.draw(in: CGRect(x: 0, y: 0, width: size.width, height: size.height))
        let normalizedImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        return normalizedImage
    }
}
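
To sanity-check the byte-layout assumptions (4 bytes per pixel, BGRA, rows packed at width * 4), I've been printing the normalized image's CGImage properties with something like this. It's just a throwaway debugging sketch; the function name and placement are arbitrary.

// Debugging sketch: dump the CGImage layout of the orientation-fixed image.
func dumpImageInfo(_ image: UIImage) {
    guard let cgImage = image.fixOrientation()?.cgImage else { return }
    print("width: \(cgImage.width), height: \(cgImage.height)")
    print("bitsPerPixel: \(cgImage.bitsPerPixel)")
    print("bytesPerRow: \(cgImage.bytesPerRow) vs width * 4: \(cgImage.width * 4)")
    print("alphaInfo: \(cgImage.alphaInfo.rawValue)")
    print("littleEndian32: \(cgImage.bitmapInfo.contains(.byteOrder32Little))")
}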

When I scroll around my original image, the pixel color that is returned is correct for every image orientation except for photos taken with the front camera in portrait or portrait upside down. Everything else works, including the front camera in landscape and every orientation with the back camera.

When I scroll around the image, it looks like the location being sampled is wrong, not the color itself (so it's not an RGBA channel-order issue).

I also tried manually applying a transform to the UIImage, using this code, but I get the same results. Am I missing something?

  • I don't see how your `fixOrientation` fixes any orientation. You are not asking for the image's `imageOrientation` or checking its Exif info, so where does the fixing happen? – matt Feb 01 '18 at 20:38
  • "I'm also fixing the image's orientation to be "up" before I set the imageData, so the CGImage has proper alignment" No, I don't get that. It seems to me that the question revolves around the correspondence between the image in memory, whose pixels you intend to examine, and the image shown to the user. But what image is it that you show to the user? Does it appear oriented correctly? If so, then surely you shouldn't need to "fix" anything at all? After all, UIImageView takes no account of the `imageOrientation`. – matt Feb 01 '18 at 20:42
  • The image I am presenting to the user is the one that gets returned from the `ImagePickerController`. If I set the `imageData` to this image instead of `fixedOrientationImage`, then the pixel colors are only correct when photos are taken in the `up` orientation. If instead, I change the orientation from, for example, `right` to `up`, and present that to the user, the image is rotated, but the color pixels are then correct. That's why I was trying to call the `fixOrientation()` method for the image in memory – Chris Feb 01 '18 at 21:37
  • You say that, but I still don't see any code that changes the orientation. And anyway, that's not how to rotate an image. And in any case it still all depends on making what you _display_ correspond to the bitmap you're examining, and I don't know what you're displaying. – matt Feb 01 '18 at 23:24

0 Answers