
I have the following function to create an MTLTexture:

let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
let bytesPerPixel = 4
let bitsPerComponent = 8
let bitsPerPixel = 32
var textureSizeX = 1200
var textureSizeY = 1200

func setupTexture() {

    // One RGBA byte quad per pixel, zero-initialised
    var rawData0 = [UInt8](repeating: 0, count: textureSizeX * textureSizeY * bytesPerPixel)

    let bytesPerRow = bytesPerPixel * textureSizeX
    let bitmapInfo = CGBitmapInfo.byteOrder32Big.rawValue | CGImageAlphaInfo.premultipliedLast.rawValue

    // Fill the buffer with opaque black via Core Graphics. The pointer
    // handed to CGContext is only guaranteed valid inside this closure,
    // so all drawing happens here.
    rawData0.withUnsafeMutableBytes { buffer in
        let context = CGContext(data: buffer.baseAddress, width: textureSizeX, height: textureSizeY, bitsPerComponent: bitsPerComponent, bytesPerRow: bytesPerRow, space: rgbColorSpace, bitmapInfo: bitmapInfo)!
        context.setFillColor(UIColor.black.cgColor)
        context.fill(CGRect(x: 0, y: 0, width: CGFloat(textureSizeX), height: CGFloat(textureSizeY)))
    }

    let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm, width: textureSizeX, height: textureSizeY, mipmapped: false)
    textureDescriptor.usage = [.renderTarget, .shaderRead]

    // `device` is the MTLDevice held elsewhere by this class
    let textureA = device.makeTexture(descriptor: textureDescriptor)

    // Upload the CPU-side buffer into the texture
    let region = MTLRegionMake2D(0, 0, textureSizeX, textureSizeY)
    textureA?.replace(region: region, mipmapLevel: 0, withBytes: rawData0, bytesPerRow: bytesPerRow)

    offscreenTexture = textureA

}

How do I change this function so that it produces a texture with full 32-bit colour depth? The colours this texture produces are too limited for my needs.
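
For what it's worth, here is a minimal sketch of the direction Ken Thomases suggests in the comments below: 32 bits per *component* (`.rgba32Uint`) rather than 32 bits per pixel, with the black fill done by hand because Core Graphics cannot draw into a 32-bit-integer context on iOS. The function name is hypothetical; `device`, `offscreenTexture` and the size globals are assumed from the code above. Be aware that unsigned-integer formats cannot be blended or filtered, so a float format such as `.rgba16Float` may be more practical if the goal is simply richer colour.

// Hypothetical sketch: 32 bits per component (rgba32Uint).
// Assumes `device`, `offscreenTexture`, `textureSizeX`/`textureSizeY`
// exist as in the question's code.
func setupTexture32() {
    let pixelCount = textureSizeX * textureSizeY

    // Four UInt32 components per pixel: R = G = B = 0, A = max (opaque black).
    var rawData = [UInt32](repeating: 0, count: pixelCount * 4)
    for i in 0..<pixelCount {
        rawData[i * 4 + 3] = UInt32.max
    }

    let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .rgba32Uint,
        width: textureSizeX,
        height: textureSizeY,
        mipmapped: false)
    textureDescriptor.usage = [.renderTarget, .shaderRead]

    let texture = device.makeTexture(descriptor: textureDescriptor)

    // 16 bytes per pixel now, not 4
    let bytesPerRow = textureSizeX * 4 * MemoryLayout<UInt32>.stride
    texture?.replace(region: MTLRegionMake2D(0, 0, textureSizeX, textureSizeY),
                     mipmapLevel: 0,
                     withBytes: rawData,
                     bytesPerRow: bytesPerRow)

    offscreenTexture = texture
}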

Further explanation:

I'm asking because of the following.

I am rendering a shader (the orange glowing dots you see below) to a texture using SKRenderer; the texture is then applied to an SCNMaterial.

When I render the shader normally in a SpriteKit scene and apply it as a texture for an SCNMaterial, I get the rich orange look I want, as seen in the TOP image.

However, when I process the image using the SKRenderer method, the results come out looking like the BOTTOM image, where the colours look more yellow, diluted, and not as rich.

[TOP image: shader rendered directly in a SpriteKit scene – rich orange]

[BOTTOM image: the same shader rendered via SKRenderer – washed-out yellow]

I put this down to the fact that I'm creating the MTLTexture without enough colour accuracy. I don't understand enough about texture creation to know what I might be doing wrong, but I believe that there is a way to increase the colour richness/accuracy. How??

  • This is using 32 bits per pixel, which is what "32bit colour depth" usually means. Did you want 32 bits per *component*? And in what way are "[t]he colours this texture produces […] too limited for [your] needs"? What do you mean by that? Explain something you need that this doesn't satisfy. – Ken Thomases Dec 29 '18 at 21:38
  • I need more accurate colour, which as I understand it would mean a setup that uses rgba32Uint instead of rgba8Unorm. Do you know how @Ken Thomases? – Geoff H Dec 29 '18 at 21:43
  • Well, for the texture descriptor, use `MTLPixelFormat.rgba32UInt`. The issue then is how to fill it with black. Don't attempt to use a `CGContext` filled with black. Core Graphics doesn't support 32 bits per component on iOS. (Even on macOS, only float components are supported for 32bpc.) Just create an array with the correct number of `UInt32`s with values 0, 0, 0, `UInt32.max` for each pixel. (If the black can be transparent, you can use all zeroes.) – Ken Thomases Dec 29 '18 at 22:02
  • It's obvious that you know what you're talking about, I can hold my own on iOS but creating textures is outside my comfort zone. So manifesting what you said there into something of practical use is a reach too far. Perhaps I don't need to go as accurate as rgba32UInt to get the accuracy I need. But I know that rgba8Unorm isn't enough. How far can you go & still have CoreGraphics support? Any chance I could humbly ask you to provide a code snippet? I'd even offer to mark your answer as accepted, not that you need the points :) – Geoff H Dec 29 '18 at 22:14
  • Geoff, could you back up and explain what you want to accomplish and why a 32-bit pixel is not precise enough to do that? Do you really need full float32 values for each of the 4 components? That is going to consume a lot of memory for large images. – MoDJ Dec 29 '18 at 23:08
  • @MoDJ Ok, further explanation added. Please help if you can. – Geoff H Dec 29 '18 at 23:29
  • 1
    So, based on the visual result you are going for, I would suggest that you make sure the rendering path you are using is not running into a problem where sRGB (gamma adjusted) values are being clamped down to linear 8 bit values stored as BGRA. I cannot be sure based on the complex render path you describe, but if this type of clamping is going on it would produce the result you are seeing. If this is the issue, it is possible that you can still get the results you are looking for with 8 bit pixel outputs, but you need to make sure all your render steps properly output to sRGB textures. – MoDJ Dec 29 '18 at 23:57
  • 1
    A real world example of what I mean by clamping a sRGB as a linear BGRA can be found in this small github example that renders H.264 under iOS with Metal. Note that at commit a450b5046db88f3f8262c630c9c27fab2ed91554 the iOS version properly renders into a SRGB Metal texture but the MacOSX version renders into a clamped BGRA texture. The texture allocation logic is in AAPLRenderer.m. https://github.com/mdejong/MetalBT709Decoder – MoDJ Dec 30 '18 at 00:20
  • I have tried creating an sRGB colour space and creating a texture descriptor with a pixel format of MTLPixelFormat.rgba8Unorm_srgb, but still no joy. None of these settings seems to be having any effect on the resulting colour. I found this post, which appears to be related: https://stackoverflow.com/questions/49564889/mtktextureloader-saturates-image Any more ideas? – Geoff H Jan 01 '19 at 20:33
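
To make MoDJ's sRGB suggestion concrete, here is a minimal sketch of that variant: the same 8-bit storage, but the bitmap context and the Metal texture are both tagged as sRGB so gamma-encoded values are not reinterpreted as linear along the way. The function name is hypothetical and the same globals (`device`, `offscreenTexture`, `textureSizeX`/`textureSizeY`) are assumed; note the asker reports trying this without success, so it is shown only to illustrate the suggestion.

// Hypothetical sketch: 8-bit storage, but explicitly sRGB end to end.
func setupSRGBTexture() {
    var rawData = [UInt8](repeating: 0, count: textureSizeX * textureSizeY * 4)
    let bytesPerRow = 4 * textureSizeX
    let bitmapInfo = CGBitmapInfo.byteOrder32Big.rawValue | CGImageAlphaInfo.premultipliedLast.rawValue

    // Draw with an explicit sRGB colour space instead of device RGB.
    let srgb = CGColorSpace(name: CGColorSpace.sRGB)!
    rawData.withUnsafeMutableBytes { buffer in
        let context = CGContext(data: buffer.baseAddress,
                                width: textureSizeX, height: textureSizeY,
                                bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                                space: srgb, bitmapInfo: bitmapInfo)!
        context.setFillColor(UIColor.black.cgColor)
        context.fill(CGRect(x: 0, y: 0, width: CGFloat(textureSizeX), height: CGFloat(textureSizeY)))
    }

    // The _srgb pixel formats convert between gamma-encoded storage and
    // linear shader values automatically on sample and write.
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .rgba8Unorm_srgb,
        width: textureSizeX, height: textureSizeY, mipmapped: false)
    descriptor.usage = [.renderTarget, .shaderRead]

    let texture = device.makeTexture(descriptor: descriptor)
    texture?.replace(region: MTLRegionMake2D(0, 0, textureSizeX, textureSizeY),
                     mipmapLevel: 0, withBytes: rawData, bytesPerRow: bytesPerRow)
    offscreenTexture = texture
}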

0 Answers