
I'm trying to get the CGColor of a specific point in my NSImage (a PNG file). This function is called on NSViews which can be dragged over my NSImageView. The function should then set a variable (defaultColor) to the color that sits exactly at the NSView's position on the NSImageView. For testing I then colored each NSView with the color stored in that variable (i.e. the color at the spot where the NSView is positioned on the NSImageView). As you can see in the screenshots, I displayed a 300x300 image containing four different colors in the NSImageView. The colors are detected, but they appear to be mirrored vertically: the colors at the top are measured when the NSViews are at the bottom, and the colors at the bottom are measured when the NSViews are at the top.

Is the byte order wrong? How can I swap it? I already read How do I get the color of a pixel in a UIImage with Swift? and Why do I get the wrong color of a pixel with following code?. That's where the code I use comes from; I only changed it a little bit:

func setDefaultColor(image: NSImage)
{
    // Center of this view, in the coordinate system of its superview.
    let posX = self.frame.origin.x + (self.frame.width / 2)
    let posY = self.frame.origin.y + (self.frame.height / 2)

    var r: CGFloat = 0
    var g: CGFloat = 0
    var b: CGFloat = 0
    var a: CGFloat = 1

    if posX <= image.size.width && posY <= image.size.height
    {
        var imageRect = CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height)
        let imageRef = image.cgImage(forProposedRect: &imageRect, context: nil, hints: nil)

        // Raw pixel buffer of the CGImage.
        let pixelData = imageRef!.dataProvider!.data
        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

        // Byte offset of the pixel, assuming 4 bytes per pixel (RGBA).
        let pixelInfo: Int = Int(posY) * imageRef!.bytesPerRow + Int(posX) * 4

        r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
        g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
        b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
        a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)
    }

    self.defaultColor = CGColor(red: r, green: g, blue: b, alpha: a)
    setNeedsDisplay(NSRect(x: 0, y: 0, width: self.frame.width, height: self.frame.height))
}

Here are some screenshots:

NSViews on the top

NSViews on the bottom

The PNG file displayed by the NSImageView should be in RGBA format. As you can see, I think the colors are extracted correctly from the pixel data:

    r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
    g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
    b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
    a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

but it seems like the pixel data is loaded in the wrong order?

Do you know how to change the data order or why this is happening?

J0n4s
  • Maybe this has something to do with the coordinate system on macOS? The code was written for an iOS app; on iOS the (0, 0) point is in the top left corner, while on macOS the (0, 0) point seems to be at the bottom left corner. How do I manage this on macOS? – J0n4s Jul 01 '20 at 09:55

1 Answer


The coordinate system of an image and the coordinate system of your view are not the same. A conversion between them is needed.

It is hard to say how

let posX = self.frame.origin.x + (self.frame.width / 2)
let posY = self.frame.origin.y + (self.frame.height / 2)

relate to your image as you did not specify any additional information.

If you have an image view and you would like to extract a pixel at a certain position (x, y) then you need to take into consideration the scaling and content mode.

The image itself is usually placed in the byte buffer so that the top-left pixel comes first, followed by the pixel to its right. The coordinate system of NSView, however, starts at the bottom left.
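To illustrate the difference (a small sketch, assuming a tightly packed 8-bit RGBA buffer; the helper names are made up):

// Byte offset of the pixel at (x, y), with y counted from the top of the image.
func byteOffset(x: Int, yFromTop: Int, bytesPerRow: Int) -> Int {
    return yFromTop * bytesPerRow + x * 4
}

// A y coordinate coming from an NSView (origin at the bottom left) has to be
// flipped before it can be used as a row index into such a buffer.
func rowFromTop(yFromBottom: Int, imageHeight: Int) -> Int {
    return imageHeight - 1 - yFromBottom
}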

To begin with, it makes most sense to get a relative position: a point with coordinates within [0, 1]. For your view it should be:

// Maps an absolute point (given in the view's superview coordinates) to a
// position relative to the view, with both coordinates in [0, 1].
func getRelativePositionInView(_ view: NSView, absolutePosition: (x: CGFloat, y: CGFloat)) -> (x: CGFloat, y: CGFloat) {
    return ((absolutePosition.x - view.frame.origin.x) / view.frame.width,
            (absolutePosition.y - view.frame.origin.y) / view.frame.height)
}

Now this point should be converted to the image coordinate system, where a vertical flip needs to be done and scaling applied.

If the content mode is simply "scale" (the whole image is shown) then the solution is simple:

// Converts a relative [0, 1] position in the view to a pixel position on the
// image, flipping vertically because the image origin is at the top left.
func pointOnImage(_ image: NSImage, relativePositionInView: (x: CGFloat, y: CGFloat)) -> (x: CGFloat, y: CGFloat)? {
    let convertedCoordinates: (x: CGFloat, y: CGFloat) = (
        relativePositionInView.x * image.size.width,
        (1.0 - relativePositionInView.y) * image.size.height
    )
    // Make sure the point actually lies on the image.
    guard convertedCoordinates.x >= 0.0 else { return nil }
    guard convertedCoordinates.y >= 0.0 else { return nil }
    guard convertedCoordinates.x < image.size.width else { return nil }
    guard convertedCoordinates.y < image.size.height else { return nil }

    return convertedCoordinates
}
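For instance, the two helpers could be wired into your setDefaultColor like this. This is only a sketch: imageView is assumed to be a reference to your NSImageView, and both views are assumed to share the same superview so that their frames live in the same coordinate system.

// Sketch only: replaces the raw posX/posY lookup inside setDefaultColor.
let center = (x: self.frame.midX, y: self.frame.midY)
let relative = getRelativePositionInView(imageView, absolutePosition: center)
if let point = pointOnImage(image, relativePositionInView: relative) {
    // Index into the pixel buffer exactly as before, but with the converted coordinates.
    let pixelInfo = Int(point.y) * imageRef!.bytesPerRow + Int(point.x) * 4
    r = CGFloat(data[pixelInfo]) / 255.0
    g = CGFloat(data[pixelInfo + 1]) / 255.0
    b = CGFloat(data[pixelInfo + 2]) / 255.0
    a = CGFloat(data[pixelInfo + 3]) / 255.0
}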

Some other, more common modes are scale-aspect-fill and scale-aspect-fit. Those need extra computation when converting points, but this does not seem to be part of your issue (for now).
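If you ever do need the aspect-fit case, a rough sketch could look like this (it assumes the scaled image is centered in the view; the function name is made up):

func pointOnAspectFitImage(_ image: NSImage, viewSize: CGSize, pointInView: CGPoint) -> CGPoint? {
    // One uniform scale factor so that the whole image fits inside the view.
    let scale = min(viewSize.width / image.size.width, viewSize.height / image.size.height)
    let fittedSize = CGSize(width: image.size.width * scale, height: image.size.height * scale)
    // Offset of the letterboxed image inside the view (assuming it is centered).
    let origin = CGPoint(x: (viewSize.width - fittedSize.width) / 2,
                         y: (viewSize.height - fittedSize.height) / 2)
    let x = (pointInView.x - origin.x) / scale
    // Flip vertically: image origin is top left, view origin is bottom left.
    let y = image.size.height - (pointInView.y - origin.y) / scale
    guard x >= 0, y >= 0, x < image.size.width, y < image.size.height else { return nil }
    return CGPoint(x: x, y: y)
}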

So the two methods will most likely fix your issue. But you can also just apply a very short fix:

let posY = whateverViewTheImageIsOn.frame.height - (self.frame.origin.y + (self.frame.height / 2))

Personally I think this is very messy, but you be the judge of that.

There are also some other considerations which may or may not apply to your case. When an image is displayed, the colors of its pixels may appear different from what is in your buffer, mostly due to scaling. For instance, a pure black-and-white image may show gray areas at some pixels. If this is something you would like to take into account when finding a color, it makes more sense to look into creating an image from the NSView. This approach could also remove a lot of mathematical problems for you.
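A minimal sketch of that snapshot approach (displayedColor is just a placeholder name); it ignores Retina scaling, where the backing bitmap can contain more pixels than the view has points:

// Renders the view into a bitmap and reads the color that is actually displayed.
func displayedColor(in view: NSView, at pointInView: NSPoint) -> NSColor? {
    guard let rep = view.bitmapImageRepForCachingDisplay(in: view.bounds) else { return nil }
    view.cacheDisplay(in: view.bounds, to: rep)
    // NSBitmapImageRep indexes pixels from the top-left corner, so flip y.
    let x = Int(pointInView.x)
    let y = Int(view.bounds.height - pointInView.y)
    return rep.colorAt(x: x, y: y)
}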

Matic Oblak