I'm working on an iPad app with OpenFrameworks and OpenGL ES 1.1. I need to display a video with an alpha channel. To simulate it, I have an RGB video (without any alpha channel) and another video containing only the alpha channel (duplicated on every RGB channel, so the white parts correspond to the visible parts and the black parts to the invisible ones). Each video is an OpenGL texture.
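For context, this is roughly how I get the two textures (just a sketch; the exact OpenFrameworks calls may differ on iOS, and the file names are placeholders):

ofVideoPlayer colorVideo;  // RGB video, no alpha channel
ofVideoPlayer maskVideo;   // grayscale video: white = visible, black = invisible

void testApp::setup() {
    colorVideo.loadMovie("color.mov");   // placeholder file names
    maskVideo.loadMovie("mask.mov");
    colorVideo.play();
    maskVideo.play();
}

void testApp::update() {
    colorVideo.update();
    maskVideo.update();
}

void testApp::draw() {
    // Raw GL texture ids passed to the blending code below.
    GLuint foregroundTexture = colorVideo.getTextureReference().getTextureData().textureID;
    GLuint maskTexture       = maskVideo.getTextureReference().getTextureData().textureID;
    // ... blending passes go here ...
}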
In OpenGL ES 1.1 there are no shaders, so I found this solution (here: OpenGL - mask with multiple textures):
glEnable(GL_BLEND);
// Use a simple blendfunc for drawing the background
glBlendFunc(GL_ONE, GL_ZERO);
// Draw entire background without masking
drawQuad(backgroundTexture);
// Next, we want a blendfunc that doesn't change the color of any pixels,
// but rather replaces the framebuffer alpha values with values based
// on the whiteness of the mask. In other words, if a pixel is white in the mask,
// then the corresponding framebuffer pixel's alpha will be set to 1.
glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_SRC_COLOR, GL_ZERO);
// Now "draw" the mask (again, this doesn't produce a visible result, it just
// changes the alpha values in the framebuffer)
drawQuad(maskTexture);
// Finally, we want a blendfunc that makes the foreground visible only in
// areas with high alpha.
glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);
drawQuad(foregroundTexture);
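drawQuad isn't defined in that answer; it just draws a full-screen textured quad. A minimal sketch of such a helper for OpenGL ES 1.1 (vertex arrays, since ES has no glBegin/glEnd), assuming a plain GL_TEXTURE_2D texture and an identity modelview/projection:

// Draws a full-screen textured quad with OpenGL ES 1.1 vertex arrays.
void drawQuad(GLuint textureId) {
    static const GLfloat vertices[] = {
        -1.0f, -1.0f,
         1.0f, -1.0f,
        -1.0f,  1.0f,
         1.0f,  1.0f,
    };
    static const GLfloat texCoords[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f,
    };

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, textureId);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisable(GL_TEXTURE_2D);
}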
It's exactly what I want to do, but glBlendFuncSeparate() doesn't exist in OpenGL ES 1.1 (or on iOS). I'm trying to do it with glColorMask instead, and I found this: Can't get masking to work correctly with OpenGL
But it doesn't work either, I guess because his mask texture file contains a 'real' alpha channel, and mine doesn't.
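For completeness, this is roughly what I'm attempting with glColorMask (a sketch adapted from that post, not its exact code): first write only the framebuffer's alpha channel from the mask, then draw the foreground keyed on the destination alpha.

glEnable(GL_BLEND);

// Background as before.
glBlendFunc(GL_ONE, GL_ZERO);
drawQuad(backgroundTexture);

// Write only to the framebuffer's alpha channel while drawing the mask.
// This only helps if the mask texture's own alpha holds the mask values;
// my mask is a grayscale RGB texture, so its source alpha is always 1.
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
glBlendFunc(GL_ONE, GL_ZERO);
drawQuad(maskTexture);
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

// Foreground keyed on the destination alpha written above.
glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);
drawQuad(foregroundTexture);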