
I've been using MTKTextureLoader to load user-provided images into textures for rendering. I render these textures to an intermediate texture, and then render the intermediate texture to the MTKView drawable. Both the intermediate texture and the drawable have the same color format.

I've run into some problems with certain images. All of the images are PNG files, but it seems like I can get different underlying data from MTKTextureLoader.

First issue:

I load one PNG with alpha and one without. Alpha seems to be the determining factor, though that's not 100% clear. Both textures' properties appear to be the same.

PNG with alpha:

Texture: <BronzeMtlTexture: 0x1015484b0>
    label = 512x512.png 
    textureType = MTLTextureType2D 
    pixelFormat = MTLPixelFormatBGRA8Unorm_sRGB 
    width = 512 
    height = 512 
    depth = 1 
    arrayLength = 1 
    mipmapLevelCount = 10 
    sampleCount = 1 
    cpuCacheMode = MTLCPUCacheModeDefaultCache 
    storageMode = MTLStorageModeManaged 
    resourceOptions = MTLResourceCPUCacheModeDefaultCache MTLResourceStorageModeManaged  
    usage = MTLTextureUsageShaderRead  
    framebufferOnly = 0 
    purgeableState = MTLPurgeableStateNonVolatile 
    parentTexture = <null> 
    parentRelativeLevel = 0 
    parentRelativeSlice = 0 
    buffer = <null> 
    bufferOffset = 0 
    bufferBytesPerRow = 0 
    iosurface = 0x0 
    iosurfacePlane = 0
    label = 512x512.png

PNG without alpha:

Texture: <BronzeMtlTexture: 0x10164a9b0>
    label = 016 - jKsgTpt.png 
    textureType = MTLTextureType2D 
    pixelFormat = MTLPixelFormatBGRA8Unorm_sRGB 
    width = 1685 
    height = 815 
    depth = 1 
    arrayLength = 1 
    mipmapLevelCount = 11 
    sampleCount = 1 
    cpuCacheMode = MTLCPUCacheModeDefaultCache 
    storageMode = MTLStorageModeManaged 
    resourceOptions = MTLResourceCPUCacheModeDefaultCache MTLResourceStorageModeManaged  
    usage = MTLTextureUsageShaderRead  
    framebufferOnly = 0 
    purgeableState = MTLPurgeableStateNonVolatile 
    parentTexture = <null> 
    parentRelativeLevel = 0 
    parentRelativeSlice = 0 
    buffer = <null> 
    bufferOffset = 0 
    bufferBytesPerRow = 0 
    iosurface = 0x0 
    iosurfacePlane = 0
    label = 016 - jKsgTpt.png

In the above case, the PNG with alpha gets loaded with its R & B components swapped. Is there a way to detect this so I can properly adjust the shader as needed?
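For clarity, here is a minimal CPU-side sketch of what "R & B swapped" means at the byte level. The function name and single-pixel array layout are my own illustration, not part of Metal's API; a `.zyxw` swizzle on a sampled value performs the equivalent reordering in a shader.

```swift
// Hypothetical helper, not a Metal API: reorders one 4-byte pixel from
// RGBA byte order to BGRA byte order (swap R and B, keep G and A).
func swapRedBlue(_ pixel: [UInt8]) -> [UInt8] {
    precondition(pixel.count == 4, "expects one RGBA8/BGRA8 pixel")
    return [pixel[2], pixel[1], pixel[0], pixel[3]]
}

// An opaque red RGBA8 pixel, misread in BGRA8 order, comes out blue:
let red: [UInt8] = [255, 0, 0, 255]
print(swapRedBlue(red))  // [0, 0, 255, 255]
```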

Second issue:

One of the PNGs I was testing with ended up loading as MTLPixelFormatRGBA16Unorm. My intermediate texture and MTKView drawable are usually MTLPixelFormatBGRA8Unorm. This is detectable, but how would I properly render this texture to the intermediate texture? I'm getting a very blown out picture in this instance.
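As a hedged guess at the washed-out result: a common cause is applying the sRGB transfer function twice, e.g. sampling already-encoded data as if it were linear and then writing it to an sRGB render target, which encodes it again. A pure-Swift sketch of the standard sRGB encode (IEC 61966-2-1) shows how double-encoding brightens midtones:

```swift
import Foundation

// Standard sRGB encode (linear -> gamma), per IEC 61966-2-1.
func srgbEncode(_ linear: Double) -> Double {
    linear <= 0.0031308
        ? 12.92 * linear
        : 1.055 * pow(linear, 1.0 / 2.4) - 0.055
}

let midGray = 0.2
let once  = srgbEncode(midGray)  // ≈ 0.48 — the correct display value
let twice = srgbEncode(once)     // ≈ 0.73 — double-encoded, "blown out"
print(once, twice)
```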


I feel like I'm missing some of the nuances of MTKTextureLoader, or maybe it wasn't meant to be used the way I want to use it.


Update 1

I'm not doing anything special with the texture loader. There isn't much to configure:

let textureLoader = MTKTextureLoader(device: metalDevice)

let options: [MTKTextureLoader.Option: Any] = [
    .generateMipmaps: true,
    .SRGB: true
]
    
textureLoader.newTexture(URL: url, options: options) { (texture, error) in
    // Store the texture here
}

As shown in the first issue, I'll get two different textures that are marked as BGRA8, but typically the ones with transparency seem to have their pixels in RGBA order. In the second issue, I have one specific PNG that loads as RGBA16.


Update 2

Pipeline setup:

let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.vertexFunction = self.library.makeFunction(name: "instance_vertex")
pipelineDescriptor.fragmentFunction = self.library.makeFunction(name: "instance_fragment")
pipelineDescriptor.colorAttachments[0].pixelFormat = newTexture.pixelFormat
pipelineDescriptor.colorAttachments[0].isBlendingEnabled = true
pipelineDescriptor.colorAttachments[0].rgbBlendOperation = .add
pipelineDescriptor.colorAttachments[0].alphaBlendOperation = .add
pipelineDescriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
pipelineDescriptor.colorAttachments[0].sourceAlphaBlendFactor = .sourceAlpha
pipelineDescriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
pipelineDescriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha

newTexture in this case is the texture loaded from the MTKTextureLoader.

Render pass set up:

let renderPassDescriptor = MTLRenderPassDescriptor()
renderPassDescriptor.colorAttachments[0].texture = canvasTexture
renderPassDescriptor.colorAttachments[0].loadAction = .clear
renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(
    red: Double(red),
    green: Double(green),
    blue: Double(blue),
    alpha: Double(alpha)
)
renderPassDescriptor.colorAttachments[0].storeAction = .store
        
let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor)!

canvasTexture was created with the same pixel format as the MTKView. I've tried BGRA8 and BGRA8 sRGB, depending on whether the loader's SRGB flag above is set.

The render:

encoder.setRenderPipelineState(pipelineState)
encoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
encoder.setVertexBuffer(uniformBuffer, offset: memorySize * offset, index: 1)
encoder.setFragmentTexture(newTexture, index: 0)
encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)

The fragment shader:

fragment half4 face_instance_fragment(VertexOut v [[stage_in]], texture2d<float, access::sample> texture [[ texture(0) ]])
{
    constexpr sampler textureSampler(mag_filter::linear,
                                     min_filter::linear,
                                     s_address::clamp_to_edge,
                                     t_address::clamp_to_edge,
                                     r_address::clamp_to_edge);
    
    return (half4)texture.sample(textureSampler, v.texturePosition);
}

Appending `.zyxw` to the sampled value above fixes the colors of one texture but breaks the other, which is how I know the colors are correct, just in the wrong order.

  • Are you ever accessing the textures as data (bytes)? In any case, edit your question to show the code where you load the PNGs and create textures from them, and anything else you do with the textures. – Ken Thomases Sep 10 '18 at 01:16
  • Try without `.SRGB: true`. Also, show the code which **uses** the textures. – Ken Thomases Sep 11 '18 at 01:11
  • I've tried with `.SRGB` as both `true` and `false`, adjusting the view and target texture format as needed. Same results. – Stephen H. Gerstacker Sep 11 '18 at 02:48
  • Just leave `.SRGB` out. If you specify it (either true or false), then you override the default interpretation of the image. Also, might as well declare the texture parameter in the fragment shader as `texture2d` since you're just casting to half, anyway. – Ken Thomases Sep 11 '18 at 02:52
  • There's been plenty of playing around to see how to fix this, so yeah, `.SRGB` has been all three states. The `half` thing was just me playing around at one point. – Stephen H. Gerstacker Sep 11 '18 at 13:14

1 Answer


This is going to be difficult to answer without seeing the code (both app and shader) and getting specifics about what you're observing and how. For example, how are you determining that the PNG with alpha has its R and B components swapped?

In any case, shaders don't need to care about the component order of the pixel format. Reads/samples from a texture always return the R component in the .r component of the output, the G component in .g, the B component in .b, and alpha in .a, regardless of the underlying pixel format.

Likewise, shaders don't need to care about whether the texture's pixel format is sRGB or not. Shaders always work with linear RGBA. Metal does automatic conversions between sRGB textures and shader values.

The pixel format does affect what type of data is used for reads, samples, and writes. Normalized (signed or unsigned) pixel formats use half or float. Floating-point pixel formats also use half or float. Unsigned integer pixel formats use ushort or uint. Signed integer pixel formats use short or int. Depth (with or without stencil) pixel formats use float.
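The mapping in the previous paragraph can be summarized as a lookup. This is an illustrative pure-Swift table keyed by pixel-format family name (my own naming, not a Metal type), matching the rules stated above:

```swift
// Illustrative lookup, not a Metal API: which MSL data types a shader
// uses for reads, samples, and writes, keyed by pixel-format family.
func shaderDataTypes(forFamily family: String) -> String {
    switch family {
    case "unorm", "snorm", "float": return "half or float"
    case "uint":                    return "ushort or uint"
    case "sint":                    return "short or int"
    case "depth":                   return "float"
    default:                        return "unknown"
    }
}

print(shaderDataTypes(forFamily: "unorm"))  // half or float
print(shaderDataTypes(forFamily: "uint"))   // ushort or uint
```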

Ken Thomases
  • For issue one, one of the textures renders with the wrong colors. If I switch the sampler to return `zyxw`, it fixes the colors, so the texture loader is giving me two textures, both labeled as BGRA8, but the shader has to swap the R & B values to make one render properly, breaking the rendering of the other. – Stephen H. Gerstacker Sep 10 '18 at 01:11
  • If the texture format is r32Float, and I sample it, which component of the sample (xyzw) contains the sampled value? – akuz Apr 29 '22 at 09:36