
I'm working on a task where, given an image file stored locally (png/jpg), I have to extract the RGB pixel values and pass them to a different function. The problem I have faced is that the RGB values I get in the iOS simulator environment and on an iOS device are different, causing the output from the next function to be very different as well. Has anyone faced a similar issue? What could be causing this strange behaviour?

I have used the SwiftImage library and another, different method to extract the RGB values, and they both produce the same output on each device (but different output across devices).

Using the SwiftImage library (from github.com/koher/swift-image), this is how I extract the RGB values:

import UIKit
import SwiftImage

extension UIImage {
    func extractrgbValues() -> [Float] {
        let swImage = Image<RGB<Float>>(uiImage: self)
        let width = swImage.width
        let height = swImage.height
        
        var reds = [[Float]](repeating: [Float](repeating: 0, count: width), count: height)
        var greens = [[Float]](repeating: [Float](repeating: 0, count: width), count: height)
        var blues = [[Float]](repeating: [Float](repeating: 0, count: width), count: height)
        
        // SwiftImage is subscripted [x, y], so flip i,j to store row-wise
        for i in 0..<width {
            for j in 0..<height {
                let pixel = swImage[i,j]
                reds[j][i] = pixel.red
                greens[j][i] = pixel.green
                blues[j][i] = pixel.blue
            }
        }
        return [reds, greens, blues].flatMap { $0 }.flatMap { $0 }
    }
}

The other reference I've tried is an answer from this post: Get Pixel color of UIImage

For the very same image, the pixel values in PC/Android environments are almost identical. But on iOS, both device and simulator produce very different outcomes, and neither is close to the PC/Android output.

  • You are using the unknown object `Image<RGB<Float>>`, and `pixel` is undefined, so it looks like calling `pixel.red` will give the same result for all `i` and `j`. Try to learn CGImage, it may help ( https://developer.apple.com/documentation/coregraphics/cgimage ) – Володимир Ukraine Jan 25 '23 at 13:50
  • Thanks for the message. `Image<RGB<Float>>` is a type from the swift-image library: https://github.com/koher/swift-image. I've fixed the reference to the `pixel` property. As for CGImage, I've used a function from this post https://stackoverflow.com/questions/3284185/get-pixel-color-of-uiimage which is a CGImage-based function, and it too produces the same outcome – nsuinteger Jan 25 '23 at 14:19

1 Answer


Given that you're decoding through UIImage, I would expect it's using a device-calibrated RGB colorspace. When you decode the image, Core Graphics adjusts it so that it displays the correct, calibrated colors on this device's screen. If you want the pixel data to match other platforms, you need to decode it using the same colorspace they do. For PC and Android, the default colorspace is .sRGB (the "standard Red Green Blue" color space defined by IEC 61966-2-1).

Note that if you then display the image locally, the colors will not match other calibrated displays.

You can set the colorspace in CIContext using createCGImage(_:from:format:colorSpace:).
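A minimal sketch of that approach might look like the following; the file path is a placeholder, and error handling is elided for brevity:

    import CoreImage

    // Decode the image, then render it into an explicit sRGB colorspace
    // so the pixel values match what PC/Android decoders produce.
    let url = URL(fileURLWithPath: "/path/to/image.jpg")
    let ciImage = CIImage(contentsOf: url)!
    let context = CIContext()
    let srgb = CGColorSpace(name: CGColorSpace.sRGB)!
    let cgImage = context.createCGImage(ciImage,
                                        from: ciImage.extent,
                                        format: .RGBA8,
                                        colorSpace: srgb)
    // Extract pixel data from `cgImage` as usual.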

If you already have the CGImage, you can make a new CGImage in a different colorspace using copy(colorSpace:).

Even with the same color space, JPEG may not decode identically on different systems. This is permitted by spec as long as the results are within a specified tolerance. Lossless formats like PNG or TIFF should always decode identically on all platforms.

Rob Napier
  • Thanks, the colorspace did seem to be the issue behind Macs and iPhones outputting different values. However, using the same colorspace as the other platforms still did not give the same results on iOS. We have discovered that if the input image is either PNG or TIFF, then all platforms output the same values. So the problem seems to be iOS's JPG decoding producing different RGB values. Thoughts? – nsuinteger Jan 31 '23 at 11:52
  • Are the image bundle resources, or do you download them? Xcode pre-processes JPEGs during the resource bundling step to re-compress them (at least it did in the early days, and I assume they never got rid of that). – Rob Napier Jan 31 '23 at 13:50
  • Note that if the differences are fairly small, that's also allowed by spec. https://stackoverflow.com/a/23714501/97337 But if the differences are large, I would expect it to be the Xcode pre-processing step. – Rob Napier Jan 31 '23 at 13:57
  • Thanks again. Right on the money: the differences allowed by the specification are causing this. That explains why the lossless formats provide identical values. Can you update your main answer above with the spec-difference info, so I can mark it as the right answer? PC and Android are using sRGB by default. I've changed the iOS code to match, and the differences have reduced greatly (though they're not identical). Cheers! – nsuinteger Feb 01 '23 at 00:57
  • 1
    Updated to include your confirmations. – Rob Napier Feb 01 '23 at 13:11