I've been using MTKTextureLoader to load user-provided images into textures for rendering. I render these textures to an intermediate texture, and then render the intermediate texture to the MTKView drawable. Both the intermediate texture and the drawable have the same color format.
I've run into some problems with certain images. All of the images are PNG files, but it seems like I can get different underlying data from the MTKTextureLoader.
First issue:
I load two PNGs, one with alpha and one without. The alpha channel seems to be the deciding factor, but that's not 100% clear. Both textures report the same properties.
PNG with alpha:
Texture: <BronzeMtlTexture: 0x1015484b0>
label = 512x512.png
textureType = MTLTextureType2D
pixelFormat = MTLPixelFormatBGRA8Unorm_sRGB
width = 512
height = 512
depth = 1
arrayLength = 1
mipmapLevelCount = 10
sampleCount = 1
cpuCacheMode = MTLCPUCacheModeDefaultCache
storageMode = MTLStorageModeManaged
resourceOptions = MTLResourceCPUCacheModeDefaultCache MTLResourceStorageModeManaged
usage = MTLTextureUsageShaderRead
framebufferOnly = 0
purgeableState = MTLPurgeableStateNonVolatile
parentTexture = <null>
parentRelativeLevel = 0
parentRelativeSlice = 0
buffer = <null>
bufferOffset = 0
bufferBytesPerRow = 0
iosurface = 0x0
iosurfacePlane = 0
label = 512x512.png
PNG without alpha:
Texture: <BronzeMtlTexture: 0x10164a9b0>
label = 016 - jKsgTpt.png
textureType = MTLTextureType2D
pixelFormat = MTLPixelFormatBGRA8Unorm_sRGB
width = 1685
height = 815
depth = 1
arrayLength = 1
mipmapLevelCount = 11
sampleCount = 1
cpuCacheMode = MTLCPUCacheModeDefaultCache
storageMode = MTLStorageModeManaged
resourceOptions = MTLResourceCPUCacheModeDefaultCache MTLResourceStorageModeManaged
usage = MTLTextureUsageShaderRead
framebufferOnly = 0
purgeableState = MTLPurgeableStateNonVolatile
parentTexture = <null>
parentRelativeLevel = 0
parentRelativeSlice = 0
buffer = <null>
bufferOffset = 0
bufferBytesPerRow = 0
iosurface = 0x0
iosurfacePlane = 0
label = 016 - jKsgTpt.png
In the above case, the PNG with alpha gets loaded with its R & B components swapped. Is there a way to detect this so I can properly adjust the shader as needed?
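For example, would inspecting the source file with ImageIO before loading be a reasonable proxy? This is a rough, untested sketch (pngHasAlpha is just a placeholder name), and it assumes, unverified, that the swap really does correlate with the PNG's alpha info:
import Foundation
import CoreGraphics
import ImageIO

// Rough sketch: report whether the source PNG has an alpha channel,
// on the unverified assumption that this is what decides the component order.
func pngHasAlpha(url: URL) -> Bool? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let image = CGImageSourceCreateImageAtIndex(source, 0, nil) else {
        return nil
    }
    switch image.alphaInfo {
    case .none, .noneSkipFirst, .noneSkipLast:
        return false
    default:
        return true
    }
}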
Second issue:
One of the PNGs I was testing with ended up loading as MTLPixelFormatRGBA16Unorm. My intermediate texture and MTKView drawable are usually MTLPixelFormatBGRA8Unorm. The mismatch is detectable, but how would I properly render this texture into the intermediate texture? In this case I'm getting a very blown-out picture.
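To be clear, detecting the mismatch is the easy part; it's what to do next that I'm unsure about. The names below are stand-ins, not my actual code:
// Stand-in names: loadedTexture is the texture from MTKTextureLoader,
// expectedFormat is whatever my intermediate texture and drawable use.
let expectedFormat: MTLPixelFormat = .bgra8Unorm
if loadedTexture.pixelFormat != expectedFormat {
    // e.g. this one PNG comes back as .rgba16Unorm; presumably some
    // conversion (or a different pipeline?) is needed before drawing it
    // into the intermediate texture.
}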
I feel like I'm missing some of the nuances of MTKTextureLoader, or maybe it wasn't meant to be used the way I want to use it.
Update 1
I'm not doing anything special with the texture loader. There isn't much to configure:
let textureLoader = MTKTextureLoader(device: metalDevice)
let options: [MTKTextureLoader.Option: Any] = [
    .generateMipmaps: true,
    .SRGB: true
]
textureLoader.newTexture(URL: url, options: options) { (texture, error) in
    // Store the texture here
}
As shown in the first issue, I'll get two different textures that are both marked as BGRA8, but the ones with transparency typically seem to have their pixels in RGBA order. In the second issue, I have one specific PNG that loads as RGBA16.
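Would the expected workaround be to bypass MTKTextureLoader for the decode and upload the bytes myself? Something along these lines is what I have in mind (untested sketch; loadBGRA8Texture is a placeholder name, and as far as I know premultipliedFirst + byteOrder32Little should give BGRA in memory, with mipmap generation still to be done separately):
import Foundation
import CoreGraphics
import ImageIO
import Metal

// Untested sketch: decode the PNG into a known BGRA8 layout with Core Graphics,
// then upload the bytes manually so the component order is never in question.
func loadBGRA8Texture(url: URL, device: MTLDevice) -> MTLTexture? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let image = CGImageSourceCreateImageAtIndex(source, 0, nil) else { return nil }

    let width = image.width
    let height = image.height
    let bytesPerRow = width * 4

    // Premultiplied-first alpha + 32-bit little-endian byte order = BGRA in memory.
    let bitmapInfo = CGImageAlphaInfo.premultipliedFirst.rawValue |
                     CGBitmapInfo.byteOrder32Little.rawValue

    guard let colorSpace = CGColorSpace(name: CGColorSpace.sRGB),
          let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: bytesPerRow,
                                  space: colorSpace,
                                  bitmapInfo: bitmapInfo),
          let data = context.data else { return nil }

    context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))

    let descriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm_srgb,
                                                              width: width,
                                                              height: height,
                                                              mipmapped: false)
    descriptor.usage = .shaderRead

    guard let texture = device.makeTexture(descriptor: descriptor) else { return nil }
    texture.replace(region: MTLRegionMake2D(0, 0, width, height),
                    mipmapLevel: 0,
                    withBytes: data,
                    bytesPerRow: bytesPerRow)
    return texture
}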
Update 2
Pipeline setup:
let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.vertexFunction = self.library.makeFunction(name: "instance_vertex")
pipelineDescriptor.fragmentFunction = self.library.makeFunction(name: "instance_fragment")
pipelineDescriptor.colorAttachments[0].pixelFormat = newTexture.pixelFormat
pipelineDescriptor.colorAttachments[0].isBlendingEnabled = true
pipelineDescriptor.colorAttachments[0].rgbBlendOperation = .add
pipelineDescriptor.colorAttachments[0].alphaBlendOperation = .add
pipelineDescriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
pipelineDescriptor.colorAttachments[0].sourceAlphaBlendFactor = .sourceAlpha
pipelineDescriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
pipelineDescriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha
newTexture in this case is the texture loaded from the MTKTextureLoader.
Render pass set up:
let renderPassDescriptor = MTLRenderPassDescriptor()
renderPassDescriptor.colorAttachments[0].texture = canvasTexture
renderPassDescriptor.colorAttachments[0].loadAction = .clear
renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(
    red: Double(red),
    green: Double(green),
    blue: Double(blue),
    alpha: Double(alpha)
)
renderPassDescriptor.colorAttachments[0].storeAction = .store
let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor)!
canvasTexture was made with the same pixel format as the MTKView. I've tried both BGRA8 and BGRA8 sRGB, depending on whether the SRGB flag was set in the loader options above.
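For reference, canvasTexture is created roughly like this; I'm reconstructing it from memory, so treat the usage flags and names (canvasWidth, canvasHeight) as approximate:
// Approximate reconstruction of how canvasTexture is made; the pixel format
// mirrors the MTKView (.bgra8Unorm or .bgra8Unorm_srgb, as described above),
// and canvasWidth/canvasHeight stand in for the drawable size.
let canvasDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm_srgb,
                                                                width: canvasWidth,
                                                                height: canvasHeight,
                                                                mipmapped: false)
canvasDescriptor.usage = [.renderTarget, .shaderRead]
let canvasTexture = metalDevice.makeTexture(descriptor: canvasDescriptor)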
The render:
encoder.setRenderPipelineState(pipelineState)
encoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
encoder.setVertexBuffer(uniformBuffer, offset: memorySize * offset, index: 1)
encoder.setFragmentTexture(newTexture, index: 0)
encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
The fragment shader:
fragment half4 face_instance_fragment(VertexOut v [[stage_in]], texture2d<float, access::sample> texture [[ texture(0) ]])
{
    constexpr sampler textureSampler(mag_filter::linear,
                                     min_filter::linear,
                                     s_address::clamp_to_edge,
                                     t_address::clamp_to_edge,
                                     r_address::clamp_to_edge);
    return (half4)texture.sample(textureSampler, v.texturePosition);
}
Adding .zyxw to the sample call above fixes the colors of the one texture but breaks the other, which is how I know the colors are correct, just in the wrong order.
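Side question: would a swizzled texture view on the CPU side be a cleaner fix than branching in the shader? This is the kind of thing I have in mind (untested; assumes makeTextureView(pixelFormat:textureType:levels:slices:swizzle:) is available, i.e. macOS 10.15 / iOS 13 or later):
// Untested idea: wrap the loaded texture in a view that swaps R and B,
// so the fragment shader can stay the same for every texture.
// (My understanding is swizzled views are for shader reads only, which is all I need here.)
let swizzle = MTLTextureSwizzleChannels(red: .blue, green: .green, blue: .red, alpha: .alpha)
let swizzledView = newTexture.makeTextureView(pixelFormat: newTexture.pixelFormat,
                                              textureType: newTexture.textureType,
                                              levels: 0..<newTexture.mipmapLevelCount,
                                              slices: 0..<newTexture.arrayLength,
                                              swizzle: swizzle)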