
I'm trying to figure out how setColor works. I have the following code:

    lazy var imageView:NSImageView = {
        let imageView = NSImageView(frame: view.frame)
        return imageView
    }()
    
    override func viewDidLoad() {
        super.viewDidLoad()
        createColorProjection()
        view.wantsLayer = true
        view.addSubview(imageView)
        
        view.needsDisplay = true
    }
    
    func createColorProjection() {
        var bitmap = NSBitmapImageRep(cgImage: cgImage!)
        var x = 0
        while x < bitmap.pixelsWide {
            var y = 0
            while y < bitmap.pixelsHigh {
                //pixels[Point(x: x, y: y)] = (getColor(x: x, y: y, bitmap: bitmap))
                bitmap.setColor(NSColor(cgColor: .black)!, atX: x, y: y)
                y += 1
            }
            x += 1
        }
        
        let image = createImage(bitmap: bitmap)
        imageView.image = image
        imageView.needsDisplay = true
    }
    
    
    func createImage(bitmap:NSBitmapImageRep) -> NSImage {
        let image = bitmap.cgImage
        return NSImage(cgImage: image! , size: CGSize(width: image!.width, height: image!.height))
    }

The intention of the code is to change a photo (a rainbow) to be entirely black (I'm just testing with black right now to make sure I understand how it works). However, when I run the program, the unchanged picture of the rainbow is shown, not a black photo.

I am getting these errors: `Unrecognized colorspace number -1` and `Unknown number of components for colorspace model -1`.

Thanks.

Ben A.
  • It looks like this was an XY problem all along. Reading, modifying, and writing back individual pixel values is probably not the way to achieve what you're after. This will use tons of CPU and memory, because bitmapped images get really big really fast. You should probably be using the `CIFilter` APIs to [make your own filter](https://developer.apple.com/library/archive/documentation/GraphicsImaging/Conceptual/CoreImaging/ci_custom_filters/ci_custom_filters.html) which does this. – Alexander Nov 03 '21 at 17:10

1 Answer


First, you're right: `setColor` has been broken since at least macOS Catalina. Apple probably hasn't fixed it because it's slow and inefficient, and hardly anyone ever used it.

Second, the docs say `NSBitmapImageRep(cgImage:)` produces a read-only bitmap, so your code wouldn't have worked even if `setColor` did.
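
For what it's worth, the usual workaround (a sketch of my own, not something the docs spell out; the helper name `writableBitmap(from:)` is made up) is to round-trip the pixels through TIFF data, which is exactly what the code further down does, so you end up with a bitmap that owns its own, writable buffer:

import Cocoa

// Sketch: build a writable NSBitmapImageRep from a CGImage by re-encoding it,
// instead of wrapping it with init(cgImage:), which the docs describe as read-only.
func writableBitmap(from cgImage: CGImage) -> NSBitmapImageRep? {
    let readOnly = NSBitmapImageRep(cgImage: cgImage)            // read-only wrapper
    guard let tiff = readOnly.tiffRepresentation else { return nil }
    return NSBitmapImageRep(data: tiff)                          // fresh rep with its own pixel buffer
}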

As Alexander says in the comments, making your own `CIFilter` is the best way to change a photo's pixels to different colors. Writing and wiring up the OpenGL-based kernel code isn't easy, but it's the most efficient route.
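
If all you need is the all-black test from the question, a built-in filter already covers it. This is a sketch of my own (the function name `blackedOut(_:)` is made up), using `CIColorMatrix` to zero out every color channel without touching pixels one by one:

import Cocoa
import CoreImage

// Sketch: run the image through CIColorMatrix with zeroed R/G/B vectors,
// which maps every pixel to black while keeping the original alpha.
func blackedOut(_ image: NSImage) -> NSImage? {
    guard let tiff = image.tiffRepresentation,
          let ciImage = CIImage(data: tiff),
          let filter = CIFilter(name: "CIColorMatrix") else { return nil }

    filter.setValue(ciImage, forKey: kCIInputImageKey)
    filter.setValue(CIVector(x: 0, y: 0, z: 0, w: 0), forKey: "inputRVector")
    filter.setValue(CIVector(x: 0, y: 0, z: 0, w: 0), forKey: "inputGVector")
    filter.setValue(CIVector(x: 0, y: 0, z: 0, w: 0), forKey: "inputBVector")
    filter.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputAVector")   // keep alpha

    guard let output = filter.outputImage else { return nil }
    let rep = NSCIImageRep(ciImage: output)
    let result = NSImage(size: rep.size)
    result.addRepresentation(rep)
    return result
}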

If you were to add an extension to NSBitmapImageRep like this:

extension NSBitmapImageRep {
    // Assumes a meshed (non-planar) bitmap with 8 bits per sample in RGB(A) order,
    // which is what an NSImage's TIFF representation gives you in practice.
    func setColorNew(_ color: NSColor, atX x: Int, y: Int) {
        guard let data = bitmapData else { return }
        
        // Pointer to the first sample of the pixel at (x, y).
        let ptr = data + bytesPerRow * y + samplesPerPixel * x
        
        // 255.1 rather than 255.0 so truncating to UInt8 lands back on the
        // original 0...255 value (see the comments below this answer).
        ptr[0] = UInt8(color.redComponent * 255.1)
        ptr[1] = UInt8(color.greenComponent * 255.1)
        ptr[2] = UInt8(color.blueComponent * 255.1)
        
        if samplesPerPixel > 3 {
            ptr[3] = UInt8(color.alphaComponent * 255.1)
        }
    }
}

Then simply changing an image's pixels could be done like this:

func changePixels(image: NSImage, newColor: NSColor) -> NSImage {
    // Build a fresh, writable bitmap from the image's TIFF data, and convert the
    // color to deviceRGB so its red/green/blue/alpha components can be read.
    guard let imgData = image.tiffRepresentation,
          let bitmap = NSBitmapImageRep(data: imgData),
          let color = newColor.usingColorSpace(.deviceRGB)
    else { return image }
    
    // Overwrite every pixel with the new color.
    var y = 0
    while y < bitmap.pixelsHigh {
        var x = 0
        while x < bitmap.pixelsWide {
            bitmap.setColorNew(color, atX: x, y: y)
            x += 1
        }
        y += 1
    }
    
    // Wrap the modified bitmap in a new NSImage.
    let newImage = NSImage(size: image.size)
    newImage.addRepresentation(bitmap)
    
    return newImage
}
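
Hypothetical usage back in the question's view controller (`imageView` comes from the question's code; the "rainbow" asset name is just for illustration):

// Swap the displayed rainbow for an all-black copy.
if let rainbow = NSImage(named: "rainbow") {
    imageView.image = changePixels(image: rainbow, newColor: .black)
    imageView.needsDisplay = true
}
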
Loengard
  • Why 255.1 and not 255.0? – Cœur Mar 13 '23 at 23:08
  • The components of NSColor range from CGFloat(0) ... CGFloat(1), and the bytes of ptr range from UInt8(0) ... UInt8(255). The multiplier used to convert CGFloat to UInt8 must be greater than 255 so that once multiplied and truncated by casting to UInt8 you'll be left with the proper value. For instance, 253 / 255 is 0.992156862745098, multiplied by 255 is 252.99999..., and truncated to UInt8 is 252 which is not the original value. However, multiplied by 255.1 is 253.0992..., truncated is 253. – Loengard Mar 14 '23 at 11:26
  • I would have preferred a `round(color.redComponent * 255.0)` or a `color.redComponent * 255.0 + 0.5`, for clarity on rounding matters. – Cœur Mar 15 '23 at 02:08
  • I just tried in Obj-C `for (int i = 0; i <= 255; i++) { CGFloat a = i / (CGFloat)255.0; assert(uint8(a * 255) == i); }` and in Swift `for i in 0...255 { let a = CGFloat(i) / 255.0; assert(UInt8(a * 255) == i); }` and the test passes for all values. There are no issues for any float values in the range 0-255. – Cœur Mar 15 '23 at 02:45
  • You're right, the modern version of Swift/Foundation handles these edge cases properly, but not all systems do. Like in Numbers: Set A1 to "=18/255", set A2 to "=Int(A1*255)". A2 shows 17, not 18, because it's truncating 17.9999.... – Loengard Mar 17 '23 at 21:58