
In my (Android OpenGL ES-based) app, I need to render a complex, self-obstructing surface with DEPTH_TEST and BLEND enabled:

GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);

A video of an example application rendering a fully opaque (alpha = 1.0 everywhere) flat surface with a bubble growing out of it can be seen here:

https://www.youtube.com/watch?v=oIpVoiFUvyw

As you can see, when the bubble reaches maximum height, its top obstructs parts of the surface below it. If you look closely, you will see that only the bottom half of the bubble gets rendered correctly, and the bubble's top half ends up mostly transparent with the surface below being opaque, which is visually incorrect.

Now, I think I know why this happens: it is the order in which the fragments get rendered to the screen. The fragments must be getting drawn from bottom to top, so for the bottom half the background below gets rendered first and the whole thing looks correct; for the top half, however, it is the bubble that gets rendered first and the background below it afterwards, so the background's fragments actually end up being the SOURCE in the

GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);

equation, thus they end up opaque (even though they are further from the camera and should have been obstructed by the bubble).
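To make that order dependence concrete, here is a small standalone sketch (plain Java, no GL; the intensity values are made up for illustration) of the SRC_ALPHA / ONE_MINUS_SRC_ALPHA equation applied in both draw orders:

```java
public class BlendOrderDemo {
    // Per-channel blend: out = src * srcAlpha + dst * (1 - srcAlpha)
    static float blend(float src, float srcAlpha, float dst) {
        return src * srcAlpha + dst * (1.0f - srcAlpha);
    }

    public static void main(String[] args) {
        float bubble = 0.8f;      // bubble fragment intensity, alpha = 1.0
        float background = 0.2f;  // background fragment intensity, alpha = 1.0

        // Bottom half: background drawn first, bubble second.
        // The bubble (alpha 1.0) fully replaces the background -- correct.
        float bottom = blend(bubble, 1.0f, background);

        // Top half: bubble drawn first, background second.
        // The background becomes SOURCE and, with alpha 1.0, fully
        // replaces the bubble -- visually wrong.
        float top = blend(background, 1.0f, bubble);

        System.out.println("bottom half: " + bottom); // 0.8 (bubble wins)
        System.out.println("top half:    " + top);    // 0.2 (background wins)
    }
}
```

The equation itself is symmetric in structure but not in roles: whichever fragment arrives last is SOURCE, so the result depends entirely on draw order.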

So I know what happens, but I don't know how to fix it. It looks like when BLEND is switched on, fragments further from the camera can still end up obstructing fragments that are closer, even though DEPTH_TEST is enabled.

One way to fix it would be to switch off BLENDing, and that indeed makes the bubble get rendered correctly. This, however, I cannot do (or I think I cannot?), because there is an additional requirement: the (possibly self-obstructing) surface may get rendered on top of another surface, and parts of it can be semi-transparent, in which case I DO want to blend them with the surface below.

So, in essence, I want to be able to render a surface so that:

  1. it DOES blend with whatever is in the framebuffer BEFORE it started getting rendered
  2. it DOES NOT blend with itself.
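For what it's worth, one idea I have been sketching (untested, and `drawSurface()` is just a placeholder for my actual draw call) is a two-pass "depth pre-pass": lay down depth first with color writes disabled, then re-draw with blending and GL_EQUAL so only the nearest fragment of the surface per pixel ever reaches the blender:

```java
// Pass 1: depth only. Fill the depth buffer with the surface's nearest
// fragments; write no color, so nothing blends yet.
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glDepthFunc(GLES20.GL_LESS);
GLES20.glColorMask(false, false, false, false);
GLES20.glDisable(GLES20.GL_BLEND);
drawSurface();

// Pass 2: color only. Re-draw the same geometry; GL_EQUAL passes exactly
// one fragment per pixel (the nearest), which then blends with whatever
// was already in the framebuffer -- but never with the surface itself.
GLES20.glColorMask(true, true, true, true);
GLES20.glDepthFunc(GLES20.GL_EQUAL);
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
drawSurface();

// Restore the default depth func if other geometry follows.
GLES20.glDepthFunc(GLES20.GL_LESS);
```

I have not verified this on my target devices, and it costs a second geometry pass, so I would still welcome a cheaper or more standard approach.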

Any tips?

  • There are ping-pong rendering techniques with FBOs... – j-p Jan 26 '15 at 16:41
  • Actually reading myself I realized that what I want is the equivalent of glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA), but where the Source is always the fragment that is closer to the camera (has lower z-value). This way blending would be correct regardless of the order fragments get rendered. Really it seems to me this would be an obvious functionality to have and something OpenGL hardware could implement very cheaply, but after searching for it, it looks like OpenGL simply does not support it... – Leszek Jan 26 '15 at 22:47
  • It doesn't support it very directly. There are approaches, like depth peeling. I wrote an overview of some common transparency rendering approaches in an answer here: http://stackoverflow.com/questions/23280692/opengl-es2-alpha-test-problems/23283256#23283256. – Reto Koradi Jan 27 '15 at 08:18

0 Answers