I have a warp kernel, and I want it to be able to produce a clear color for some coordinates, similar to discard_fragment() in fragment shaders.
If I just return coordinates that are outside the image extent, it works as expected when rendered to an MTKView (I can see clear pixels). However, when rendering to a bitmap, the pixels that are supposed to be clear take the color of the nearest edge (similar to sampling an image clampedToExtent).
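For reference, this is roughly what I'm doing in the warp kernel; the kernel name, the extent argument, and the discard predicate are just placeholders for my real logic:

```metal
// Core Image Metal kernel source (sketch; names and predicate are placeholders).
#include <CoreImage/CoreImage.h>
using namespace metal;

extern "C" float2 myWarp(float2 extent, coreimage::destination dest) {
    float2 dc = dest.coord();

    // For pixels I want "discarded", map to a source coordinate far outside
    // the input extent, expecting it to come back as clear.
    if (dc.x < extent.x * 0.5) {
        return float2(-1.0e6, -1.0e6);
    }

    // Identity mapping everywhere else (the real warp goes here).
    return dc;
}
```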
One option is to rewrite the warp kernel as a general CIKernel and return a clear color from there, but then we lose the benefits of the warp kernel's simplicity and its under-the-hood optimizations.
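Something like this, just to illustrate (same placeholder names as above, real warp logic omitted):

```metal
// General CIKernel version (sketch): it can return a clear color directly.
#include <CoreImage/CoreImage.h>
using namespace metal;

extern "C" float4 myWarpAsColorKernel(coreimage::sampler src,
                                      float2 extent,
                                      coreimage::destination dest) {
    float2 dc = dest.coord();

    // Return transparent black for "discarded" pixels.
    if (dc.x < extent.x * 0.5) {
        return float4(0.0);
    }

    // Otherwise sample the source at the (here: identity) warped coordinate.
    return src.sample(src.transform(dc));
}
```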
I can also think of this solution:
- Overlay the input image over a larger black image
- Crop back to the original extent after applying the effect

This would require changing the kernel logic to take the extra edge size into account (rough sketch below).
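A minimal sketch of that workaround in Swift, assuming `input` is the source CIImage, `applyWarp` stands in for applying the kernel, and `padding` is the hypothetical extra edge size:

```swift
import CoreImage

// Pad the input with a clear (or black) background, warp, then crop back.
// All names here are placeholders, not my actual API.
func warpWithPadding(_ input: CIImage,
                     padding: CGFloat,
                     applyWarp: (CIImage) -> CIImage) -> CIImage {
    let originalExtent = input.extent
    let paddedExtent = originalExtent.insetBy(dx: -padding, dy: -padding)

    // Composite the input over a background covering the padded extent,
    // so out-of-extent samples hit real pixels instead of clamped edges.
    let background = CIImage(color: .clear).cropped(to: paddedExtent)
    let padded = input.composited(over: background)

    // Apply the warp, then crop back to the original extent.
    return applyWarp(padded).cropped(to: originalExtent)
}
```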
Is there a simpler, cleaner way to do this?