
I am attempting to place a number of overlays (textures) on top of an existing texture. For the most part, this works fine.

However, for the life of me, I can't figure out why the output of this is sporadically "flickering" in the drawRect method of my MTKView. Everything seems fine; I do further processing on theTexture (in a kernel shader) after the loop that places my overlays. For some reason, I feel like this encoding is ending early and not enough work is getting done on it.

To clarify, everything starts out fine, but about 5 seconds in, the flickering starts and gets progressively worse. For debugging purposes (right now, anyway) that loop runs only once -- there is only one overlay element. The input texture (theTexture) is bona fide every time before I start (created with a descriptor where storageMode is MTLStorageModeManaged and usage is MTLTextureUsageUnknown).

I've also tried stuffing the encoder instantiation/ending inside the loop; no difference.

Can someone help me see what I'm doing wrong?

id<MTLTexture> theTexture; // valid input texture as "background"

MTLRenderPassDescriptor *myRenderPassDesc = [MTLRenderPassDescriptor renderPassDescriptor];
myRenderPassDesc.colorAttachments[0].texture = theTexture;
myRenderPassDesc.colorAttachments[0].storeAction = MTLStoreActionStore;
myRenderPassDesc.colorAttachments[0].loadAction = MTLLoadActionLoad;

id<MTLRenderCommandEncoder> myEncoder = [commandBuffer renderCommandEncoderWithDescriptor:myRenderPassDesc];
MTLViewport viewPort = {0.0, 0.0, 1920.0, 1080.0, -1.0, 1.0};
vector_uint2 imgSize = vector2((u_int32_t)1920,(u_int32_t)1080);
[myEncoder setViewport:viewPort];
[myEncoder setRenderPipelineState:metalVertexPipelineState];

for (OverlayWrapper *ow in overlays) {
    id<MTLTexture> overlayTexture = ow.overlayTexture;

    VertexRenderSet *v = [ow getOverlayVertexInfoPtr];
    NSUInteger vSize = v->metalVertexCount*sizeof(AAPLVertex);
    id<MTLBuffer> mBuff = [self.device newBufferWithBytes:v->metalVertices
                                                   length:vSize
                                                  options:MTLResourceStorageModeShared];

    [myEncoder setVertexBuffer:mBuff offset:0 atIndex:0];
    [myEncoder setVertexBytes:&imgSize length:sizeof(imgSize) atIndex:1];
    [myEncoder setFragmentTexture:overlayTexture atIndex:0];
    [myEncoder drawPrimitives:MTLPrimitiveTypeTriangle vertexStart:0 vertexCount:v->metalVertexCount];
}

[myEncoder endEncoding];

// do more work (kernel shader) with "theTexture"...
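
For reference, the variant mentioned above, with the encoder instantiated and ended inside the loop, would look roughly like this (a sketch only, reusing the same descriptor and state as above); it made no difference:

for (OverlayWrapper *ow in overlays) {
    // One render pass per overlay: load theTexture's current contents, draw, store.
    id<MTLRenderCommandEncoder> loopEncoder =
        [commandBuffer renderCommandEncoderWithDescriptor:myRenderPassDesc];
    [loopEncoder setViewport:viewPort];
    [loopEncoder setRenderPipelineState:metalVertexPipelineState];

    VertexRenderSet *v = [ow getOverlayVertexInfoPtr];
    id<MTLBuffer> mBuff = [self.device newBufferWithBytes:v->metalVertices
                                                   length:v->metalVertexCount * sizeof(AAPLVertex)
                                                  options:MTLResourceStorageModeShared];
    [loopEncoder setVertexBuffer:mBuff offset:0 atIndex:0];
    [loopEncoder setVertexBytes:&imgSize length:sizeof(imgSize) atIndex:1];
    [loopEncoder setFragmentTexture:ow.overlayTexture atIndex:0];
    [loopEncoder drawPrimitives:MTLPrimitiveTypeTriangle
                    vertexStart:0
                    vertexCount:v->metalVertexCount];
    [loopEncoder endEncoding];
}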


UPDATE #1: I've attached an image of a "good" frame, with the vertex area (lower right) being shown. My encoder is responsible for placing the green stand-in "image" on top of the video frame theTexture at 30fps, which it does do. Just to clarify, theTexture is created for each frame (from a CoreVideo pixel buffer). After this encoder, I only read from theTexture in a kernel shader to adjust brightness -- all of that is working just fine.

My problems must exist elsewhere, as the video frames stop flowing (though the audio keeps going) and I end up alternating between 2 or 3 previous frames once this encoder is inserted (hence, the flicker). I believe now that my video pixel buffer vendor is being inadvertently supplanted by this "overlay" vendor.

If I comment out this entire vertex renderer, my video frames flow through just fine; it's NOT a problem with my video frame vendor.

A sample "good" frame of video[1]

UPDATE #2:

Here is the declaration of my rendering pipeline:

MTLRenderPipelineDescriptor *p = [[MTLRenderPipelineDescriptor alloc] init];
if (!p)
    return nil;
p.label = @"Vertex Mapping Pipeline";
p.vertexFunction   = [metalLibrary newFunctionWithName:@"vertexShader"];
p.fragmentFunction = [metalLibrary newFunctionWithName:@"samplingShader"];
p.colorAttachments[0].pixelFormat = MTLPixelFormatBGRA8Unorm;

NSError *error;
metalVertexPipelineState = [self.device newRenderPipelineStateWithDescriptor:p
                                                                       error:&error];
if (error || !metalVertexPipelineState)
    return nil;

Here is the texture descriptor used for creation of theTexture:

metalTextureDescriptor = [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm 
                                                                            width:width
                                                                           height:height
                                                                        mipmapped:NO];
metalTextureDescriptor.storageMode = MTLStorageModePrivate;
metalTextureDescriptor.usage = MTLTextureUsageUnknown;

I haven't included the AAPLVertex and the vertex/fragment functions because of this: If I just comment out the OverlayWrapper loop in my rendering code (ie. don't even set vertex buffers or draw primitives), the video frames still flicker. The video is still playing but only 2-3 frames or so are playing in a continuous loop, from the time that this encoder "ran".

I've also added this code after the [... endEncoding] and changed the texture storage mode to MTLStorageModeManaged -- still, no dice:

id<MTLBlitCommandEncoder> blitEncoder = [commandBuffer blitCommandEncoder];
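// synchronizeResource: applies to managed resources: it makes the GPU's copy of the texture
// visible to the CPU once the command buffer completes.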
[blitEncoder synchronizeResource:crossfadeOutput];
[blitEncoder endEncoding];

To clarify a few things: the subsequent compute shader uses theTexture for input only. These are video frames; thus, theTexture is re-created each time. Before it goes through this render stage, it already has a bona fide "background".


UPDATE #3:

I got this working, if by unconventional means.

I used this vertex shader to render my overlay onto the transparent background of a newly created blank texture, with the loadAction being MTLLoadActionClear and a clearColor of (0,0,0,0).

I then mixed the resulting texture with theTexture in a kernel shader. I should not have to do this, but it works!
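
In host-side code, that workaround looks roughly like this (a sketch only; mixPipelineState is a compute pipeline built from my mixing kernel and mixedOutput is its destination texture, both of which are illustrative names):

// 1) Render the overlay into a fresh texture that starts out fully transparent.
MTLTextureDescriptor *overlayDesc =
    [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm
                                                       width:1920
                                                      height:1080
                                                   mipmapped:NO];
overlayDesc.storageMode = MTLStorageModePrivate;
overlayDesc.usage = MTLTextureUsageRenderTarget | MTLTextureUsageShaderRead;
id<MTLTexture> overlayTarget = [self.device newTextureWithDescriptor:overlayDesc];

MTLRenderPassDescriptor *overlayPass = [MTLRenderPassDescriptor renderPassDescriptor];
overlayPass.colorAttachments[0].texture     = overlayTarget;
overlayPass.colorAttachments[0].loadAction  = MTLLoadActionClear;
overlayPass.colorAttachments[0].clearColor  = MTLClearColorMake(0, 0, 0, 0);   // transparent
overlayPass.colorAttachments[0].storeAction = MTLStoreActionStore;
// ... encode the same overlay draws as before, but into overlayTarget ...

// 2) Mix overlayTarget over theTexture in a kernel (compute) pass.
id<MTLComputeCommandEncoder> mixEncoder = [commandBuffer computeCommandEncoder];
[mixEncoder setComputePipelineState:mixPipelineState];   // kernel that composites the two inputs
[mixEncoder setTexture:theTexture    atIndex:0];
[mixEncoder setTexture:overlayTarget atIndex:1];
[mixEncoder setTexture:mixedOutput   atIndex:2];         // destination
MTLSize tg  = MTLSizeMake(16, 16, 1);
MTLSize tgs = MTLSizeMake((1920 + 15) / 16, (1080 + 15) / 16, 1);
[mixEncoder dispatchThreadgroups:tgs threadsPerThreadgroup:tg];
[mixEncoder endEncoding];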

zzyzy
  • Perhaps add screenshots of a good frame and a bad frame. How was the pipeline state set up (i.e. what was the descriptor it was created from)? Show the vertex and fragment functions. Show the declaration of `AAPLVertex`. Does the compute shader just use `theTexture` as input or does it modify it? Why is that not relevant to show? It's not clear from what you wrote: do you recreate `theTexture` for every frame? Do you somehow fill it or clear it each time? Or did I get that wrong and you reuse it from one frame to next? If the latter, then all the drawing will accumulate. – Ken Thomases Sep 26 '17 at 23:48
  • Ken, I've added a couple of updates to clarify a few things. Hopefully these answer your questions. – zzyzy Sep 28 '17 at 04:59
  • I don't know how you're creating `theTexture` from video frames, but perhaps [this](https://stackoverflow.com/questions/43550769/holding-onto-a-mtltexture-from-a-cvimagebuffer-causes-stuttering) is related? – Ken Thomases Sep 28 '17 at 15:31
  • I'm doing precisely that same thing -- using CVMetalTextureCacheCreateTextureFromImage with AVPlayer output to get a CVMetalTexture and then CVMetalTextureGetTexture to get the MTLTexture. Seems to work fine until I introduce this new vertex encoder. I do hold a strong pointer to the CVPixelBuffer until the end of the render loop. – zzyzy Sep 28 '17 at 17:35
  • Introducing the render encoder with `theTexture` as a color attachment makes a strong reference to `theTexture` that lasts at least until the command buffer has been processed past that encoder's position in the buffer, possibly until the buffer has completed and is itself deallocated. Perhaps you have too many in flight and you're exhausting the pool of pixel buffers / textures. – Ken Thomases Sep 28 '17 at 18:17
  • This is about stage 6 or 7 of my rendering pipeline... can I safely nil out textures from previous stages as a means to mitigate the issue? I didn't think doing so would immediately percolate to the GPU, whether that texture was private or managed. – zzyzy Sep 29 '17 at 14:15
  • I got this working (see Update #3, above) even if it is a bit of kludge. – zzyzy Oct 03 '17 at 17:02
  • I stumbled upon this because I was having similar flickering problems when using `MTLLoadActionLoad` and `MTLStoreActionStore` when trying to preserve the contents of the framebuffer texture between frames (for incremental redraws). For me, it was due to the fact that there's a pool of drawables/textures used in a `CAMetalLayer` for double or triple buffering (and there's no way to force it to use a single drawable). So you'd need to keep all of the drawables in sync if you wanted a consistent set of frames, as well as making sure they're all initialized/cleared correctly to begin with. – bitjeep May 22 '19 at 15:59

1 Answer


I had the same problem and wanted to explore a simpler solution before attempting @zzyzy's. This solution is also somewhat unsatisfying but at least seems to work.

The key (but inadequate in and of itself) is to reduce the buffering on the Metal layer:

metalLayer_.maximumDrawableCount = 2

Second, once the buffering was reduced, I found I had to go through a render/present/commit cycle to draw a trivial, invisible item with .clear set on the render pass descriptor — pretty straightforward:

renderPassDescriptor.colorAttachments[0].loadAction = .clear

(That there were a few invisible triangles drawn is probably irrelevant; it is probably the MTLLoadActionClear attribute that differentiates the pass. I used the same clear color as @zzyzy above and I think this echoes the above solution.)

Third, I found I had to run the code through that render/present/commit cycle a second time — i.e., twice in a row. Of the three, this seems the most arbitrary and I don't pretend to understand it, but the three together worked for me.
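
Put together, the cycle described above looks roughly like this in Objective-C (to match the question's code); this is a sketch only, where metalLayer and commandQueue are assumed to exist and the loop simply runs the clear cycle twice:

// Requires QuartzCore/CAMetalLayer.h and Metal/Metal.h.
metalLayer.maximumDrawableCount = 2;                 // reduce buffering on the layer

for (int i = 0; i < 2; i++) {                        // run the clear cycle twice in a row
    id<CAMetalDrawable> drawable = [metalLayer nextDrawable];
    if (!drawable)
        continue;

    MTLRenderPassDescriptor *rpd = [MTLRenderPassDescriptor renderPassDescriptor];
    rpd.colorAttachments[0].texture     = drawable.texture;
    rpd.colorAttachments[0].loadAction  = MTLLoadActionClear;
    rpd.colorAttachments[0].clearColor  = MTLClearColorMake(0, 0, 0, 0);
    rpd.colorAttachments[0].storeAction = MTLStoreActionStore;

    id<MTLCommandBuffer> cb = [commandQueue commandBuffer];
    id<MTLRenderCommandEncoder> enc = [cb renderCommandEncoderWithDescriptor:rpd];
    // (optionally draw a trivial, invisible primitive here; the clear seems to be what matters)
    [enc endEncoding];

    [cb presentDrawable:drawable];
    [cb commit];
}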

Cope
  • Reducing the max number of command buffers to 1 (instead of, say, 3) was the fix for me. It would seem the texture I was reading from and writing to was being overwritten in the next loop before the previous loop had finished, resulting in a flicker. I agree the solution is unsatisfying, but in my case I don't need high performance so it works for me. An alternative would be not to store the texture (i.e. recreate it each time) -- though that also sounds like it could be expensive. – JCutting8 Jul 04 '20 at 05:17