
I have an MTKView set to use MTLPixelFormat.rgba16Float. I'm having display issues which can be best described with the following graphic:

[image: the intended UIColor shown next to the washed-out color displayed by the MTKView]

So the intended UIColor becomes washed out, but only while it is being displayed in the MTKView. When I convert the drawable texture back to an image for display in a UIView via CIImage, I get back the original color. Here is how I create that output:

let colorSpace = CGColorSpaceCreateDeviceRGB()
let kciOptions = [kCIImageColorSpace: colorSpace,
                  kCIContextOutputPremultiplied: true,
                  kCIContextUseSoftwareRenderer: false] as [String : Any]  
let strokeCIImage = CIImage(mtlTexture: metalTextureComposite, options: kciOptions)!.oriented(CGImagePropertyOrientation.downMirrored)
let imageCropCG = cicontext.createCGImage(strokeCIImage, from: bbox, format: kCIFormatABGR8, colorSpace: colorSpace) 

Other pertinent settings:

uiColorBrushDefault: UIColor = UIColor(red: 0.92, green: 0.79, blue: 0.18, alpha: 1.0)
self.colorPixelFormat = MTLPixelFormat.rgba16Float
renderPipelineDescriptor.colorAttachments[0].pixelFormat = self.colorPixelFormat

// below is the colorspace for the texture which is tinted with UIColor
let colorSpace = CGColorSpaceCreateDeviceRGB()
let texDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: MTLPixelFormat.rgba8Unorm, width: Int(width), height: Int(height), mipmapped: isMipmaped)
target = texDescriptor.textureType
texture = device.makeTexture(descriptor: texDescriptor)

Some posts have hinted that sRGB is being assumed somewhere, but none give specifics on how to disable it.

I'd like the color that I display in the MTKView to match the input (or come as close to it as possible) and still be able to convert that texture into something I can display in a UIImageView. I've tested this on an iPad Air and a new iPad Pro; the behavior is the same. Any help would be appreciated.

Plutovman
  • So, the first thing you should do is remove Core Image from your processing chain to make sure that you are actually getting the correct color values into the MTKView. One simple way to do that is to use a hard-coded clear color directly via Metal in the MTKView (a minimal sketch of this test follows these comments). I would suggest that you first get it working with sRGB, then try to convert the working code over to support linear 16-bit color. – MoDJ Mar 27 '19 at 21:19
  • I think `MTKView` internally does a colorspace conversion to the colorspace set in its `colorspace` property, so you aren't getting the result you want. – JustSomeGuy Mar 28 '19 at 06:41
  • I do not know Swift, but similar problems are usually due to, first, color space: the image view will probably display the data as sRGB (the more sensible choice if there is no color profile), but you selected `CGColorSpaceCreateDeviceRGB`, which on modern machines can be DCI-P3 instead of sRGB. Second, linear vs. gamma-corrected: floats are often used for linear spaces. – Giacomo Catenazzi Mar 28 '19 at 12:48
  • Thank you @MoDJ. Very sensible suggestions. Taking CoreImage out of the equation just leaves me with the *brighter* color. It sounds then like I need to feed in *corrected* UIColors that, when displayed, will match the original... Do you know what kind of correction I should apply (math-wise)? – Plutovman Mar 28 '19 at 18:57
  • sRGB colors are gamma-encoded, byte-range values; when you define a color using linear 16-bit values, those colors are not gamma encoded (since they are linear). This is likely the cause of the issue you are seeing. What you should do is make sure you specify colors as sRGB and then write those sRGB values into a texture that uses 16-bit pixel storage; Metal typically does the conversion for you. – MoDJ Mar 28 '19 at 20:22
  • So, the color I've defined is a UIColor, which, for iOS 10 or later, is natively defined in an extended-range sRGB color space. I tried using rgba16Float for the MTLTextureDescriptor, but it doesn't allow it, so I defaulted to rgba8Unorm. I think the discussions say that Metal should do the conversion for you. It feels like I'm getting closer, but the subtleties of colorspace interpretation are getting in the way. – Plutovman Mar 28 '19 at 21:01
  • The best performance comes from writing to an sRGB texture as opposed to a 16-bit float texture. Typically, writing 16-bit pixels takes roughly 2x as long as writing 8-bit pixels. You should try using MTLPixelFormatBGRA8Unorm_sRGB as the Metal pixel format to enable that. – MoDJ Mar 28 '19 at 21:25
  • Just tried that. Strangely, I don't see a perceptible difference, or one that gets me closer to the intended UIColor. I'll double-check my code. I do appreciate your feedback. – Plutovman Mar 28 '19 at 21:55
  • I posted a solution below. Thanks to all for helping me think through this. – Plutovman Mar 29 '19 at 18:44
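
A minimal sketch of the sanity test suggested in the first comment above: bypass Core Image and hard-code the brush color as the MTKView's clear color, so the raw Metal output can be compared against a UIView filled with the same UIColor. The function name here is hypothetical.

import UIKit
import MetalKit

// Sanity check: clear the MTKView to the brush color and compare it
// side by side with a UIView backed by the same UIColor.
func applyClearColorTest(to mtkView: MTKView) {
    let brush = UIColor(red: 0.92, green: 0.79, blue: 0.18, alpha: 1.0)
    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
    brush.getRed(&r, green: &g, blue: &b, alpha: &a)
    mtkView.colorPixelFormat = .rgba16Float
    mtkView.clearColor = MTLClearColor(red: Double(r), green: Double(g),
                                       blue: Double(b), alpha: Double(a))
    // A render pass with a .clear load action is enough to show the color.
}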

2 Answers


So, it looks like you are very close to the complete solution, but what you have is not quite correct. Here is a Metal function, which you can add to your shader code, that converts an sRGB value to a linear value (I still suggest that you write to an sRGB texture, but you can also write to a 16-bit texture). Note that sRGB is not a simple 2.2 gamma curve.

// Convert a non-linear sRGB value to a linear value.
// Note that normV must be normalized to the range [0.0, 1.0].

static inline
float sRGB_nonLinearNormToLinear(float normV)
{
  if (normV <= 0.04045f) {
    normV *= (1.0f / 12.92f);
  } else {
    const float a = 0.055f;
    const float gamma = 2.4f;
    //const float gamma = 1.0f / (1.0f / 2.4f);
    normV = (normV + a) * (1.0f / (1.0f + a));
    normV = pow(normV, gamma);
  }

  return normV;
}
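
For reference, a CPU-side Swift translation of the same transfer function, useful if you want to pre-linearize a UIColor component before handing it to Metal, might look like this (the function name is my own):

import UIKit

// Exact sRGB -> linear conversion (piecewise, not a plain 2.2 gamma curve),
// mirroring the Metal function above. Input must be in [0.0, 1.0].
func srgbToLinear(_ v: CGFloat) -> CGFloat {
    if v <= 0.04045 {
        return v / 12.92
    } else {
        return pow((v + 0.055) / 1.055, 2.4)
    }
}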
MoDJ
  • Thank you @MoDJ! Just as a test, I've implemented a version of this function on the Swift side of things, as well as changed the texture I write to so that it uses MTLPixelFormat.bgra8Unorm_srgb. I'm not seeing any obvious differences from the outset, but I haven't done exhaustive testing. I'm assuming this linearization needs to be applied to r, g, b, a individually, correct? – Plutovman Mar 30 '19 at 19:48
  • Yes, R, G, B need to each be sent through the non-linear to linear function. If you are only converting the values for one pixel, then runtime performance will not be an issue. If you are converting all pixels in an image, that should be done in Metal for performance reasons. More background on where this function came from: https://stackoverflow.com/questions/53911662/does-h-264-encoded-video-with-bt-709-matrix-include-any-gamma-adjustment – MoDJ Mar 30 '19 at 22:25
  • Just to clarify, R,G,B ONLY? If I don't apply to A, then my opacity settings don't seem quite right. – Plutovman Mar 30 '19 at 23:00
  • Yes, you originally described an opaque color, so I assume that is what you are still doing. You should not also attempt to mix variable-opacity (alpha channel) pre-multiplication into this; you already have enough complexity to deal with. Leave alpha at 1.0 in Metal, or 0xFF when dealing with byte values. – MoDJ Mar 31 '19 at 05:04
  • You are right about the complexity...Roughly, my app works like this: I pick a color, assign opacity based on pencil pressure, pass RGBA to a vertex shader that then multiplies that color by a stamp texture (with alpha). The result being a (hopefully) elegantly textured brushstroke...Incidentally, I did notice that setting MTLTextureDescriptor to bgra8Unorm_srgb causes edge ringing in my png stamp textures (which get multiplied by the color in question). This artifact goes away once I switch back to bgra8Unorm. I'll keep tinkering...Thanks, and I appreciate all your insights. – Plutovman Mar 31 '19 at 18:14
  • Geesh pluto, you are packing literally every feature into this. If you are using the alpha channel then you will need to also explicitly multiply the alpha value by the R, G, B channel values after the gamma correction and then save these so that the data is the correct pre-multiplied format. You can find working code for this in the MetalBT709 project linked from that SO question I referenced above. – MoDJ Mar 31 '19 at 18:21
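
Following up on the pre-multiplication point in the comment above, a minimal sketch (with a hypothetical helper name) of premultiplying straight-alpha components after the gamma correction:

import UIKit

// Premultiply straight-alpha linear RGB by alpha, which is what a
// premultiplied-alpha texture or blend setup expects.
func premultiplied(r: CGFloat, g: CGFloat, b: CGFloat, a: CGFloat)
    -> (r: CGFloat, g: CGFloat, b: CGFloat, a: CGFloat) {
    return (r * a, g * a, b * a, a)
}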

The key turned out to be undoing the gamma correction that is inherently embedded in a UIColor:

let colorSRGB = UIColor(red: 0.92, green: 0.79, blue: 0.18, alpha: 1.0)
let rgbaSRGB = colorSRGB.getRGBAComponents()!
let gammapower : CGFloat = 2.2

let r = pow(rgbaSRGB.red, gammapower)
let g = pow(rgbaSRGB.green, gammapower)
let b = pow(rgbaSRGB.blue, gammapower)
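// Note: alpha is 1.0 here, so the pow below is a no-op; in general alpha is not gamma-encoded and should be left as-is.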
let a = pow(rgbaSRGB.alpha, gammapower)

let colorNoGamma: UIColor = UIColor(red: CGFloat(r), green: CGFloat(g), blue: CGFloat(b), alpha: CGFloat(a))

Once I pass colorNoGamma to be applied in an MTKView with MTLPixelFormat.rgba16Float, the results match a UIView displaying colorSRGB. It makes sense once you think it through... Thanks to @MoDJ for pointing me down the right path.
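
The getRGBAComponents() call above appears to be a custom helper rather than a UIKit API; a minimal sketch of such an extension, under that assumption, could be:

import UIKit

extension UIColor {
    // Return the straight (non-premultiplied) RGBA components of the color,
    // or nil if they cannot be extracted.
    func getRGBAComponents() -> (red: CGFloat, green: CGFloat, blue: CGFloat, alpha: CGFloat)? {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard getRed(&r, green: &g, blue: &b, alpha: &a) else { return nil }
        return (r, g, b, a)
    }
}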

Note that, on the MTKView side, I can keep the texture settings as originally defined, namely:

let texDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: MTLPixelFormat.rgba8Unorm, width: Int(width), height: Int(height), mipmapped: isMipmaped)
target = texDescriptor.textureType

And if I want to convert the currentDrawable into an image to be displayed in a UIImageView, then I want to make sure I don't apply ANY colorspace settings, i.e.:

let strokeCIImage = CIImage(mtlTexture: metalTextureComposite, options: [:])!.oriented(CGImagePropertyOrientation.downMirrored)
let imageCropCG = cicontext.createCGImage(strokeCIImage, from: box, format: kCIFormatABGR8, colorSpace: colorSpace)  // kCIFormatRGBA8 gives same result. Not sure what's proper
let layerStroke = CALayer()
layerStroke.frame = bbox
layerStroke.contents = imageCropCG

Note that the argument `options: [:]` means I pass none of the colorspace/premultiplication/renderer settings that I defined in kciOptions in the original post.

Plutovman
  • Note that MTKView doesn't have to work in linear color space. When you set `rgba16Float` as your pixel format and don't specify a color space, Core Animation assumes your colors are in linear space when compositing your MTKView. This is why you were able to get your colors to match when you took your UIColor rgb values and converted them to linear space. If you explicitly set the color space of your CAMetalLayer to sRGB, then your UIColor will appear the same as in your UIView without having to convert anything (a minimal sketch of this follows these comments). – Ziconic Aug 05 '23 at 20:32
  • That is very good to know. I'm going to revisit my code and try your suggestion. Thanks! – Plutovman Aug 11 '23 at 23:40
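
A minimal sketch of the CAMetalLayer approach described in the comment above, assuming a deployment target on which CAMetalLayer.colorspace is settable:

import MetalKit
import QuartzCore

// Ask Core Animation to interpret the float-format drawable as sRGB
// instead of assuming a linear color space when compositing.
func configureColorSpace(of mtkView: MTKView) {
    mtkView.colorPixelFormat = .rgba16Float
    if let metalLayer = mtkView.layer as? CAMetalLayer {
        metalLayer.colorspace = CGColorSpace(name: CGColorSpace.sRGB)
    }
}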