
I'm working on an iPad app with OpenFrameworks and OpenGL ES 1.1. I need to display a video with an alpha channel. To simulate it, I have an RGB video (without any alpha channel) and another video containing only the alpha channel (duplicated on every RGB channel, so the white parts correspond to the visible parts and the black parts to the invisible ones). Each video is an OpenGL texture.

In OpenGL ES 1.1 there are no shaders, so I found this solution (here: OpenGL - mask with multiple textures):

glEnable(GL_BLEND);
// Use a simple blendfunc for drawing the background
glBlendFunc(GL_ONE, GL_ZERO);
// Draw entire background without masking
drawQuad(backgroundTexture);
// Next, we want a blendfunc that doesn't change the color of any pixels,
// but rather replaces the framebuffer alpha values with values based
// on the whiteness of the mask. In other words, if a pixel is white in the mask,
// then the corresponding framebuffer pixel's alpha will be set to 1.
glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_SRC_COLOR, GL_ZERO);
// Now "draw" the mask (again, this doesn't produce a visible result, it just
// changes the alpha values in the framebuffer)
drawQuad(maskTexture);
// Finally, we want a blendfunc that makes the foreground visible only in
// areas with high alpha.
glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);
drawQuad(foregroundTexture);

It's exactly what I want to do, but glBlendFuncSeparate() doesn't exist in OpenGL ES 1.1 (or on iOS). I'm trying to do it with glColorMask, and I found this: Can't get masking to work correctly with OpenGL

But it doesn't work either, I guess because his mask texture file contains a 'real' alpha channel, whereas mine doesn't.


1 Answer


I highly suggest you compute a single RGBA texture instead.

This will be both easier and faster (because you're currently sending two RGBA textures each frame; yes, your RGB texture is in fact stored as RGBA by the hardware, with the A simply ignored).

glColorMask won't help you, because it simply says "turn writes to this channel on or off completely".
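For illustration, the most you can express with it is a call like the one below: whole channels are enabled or disabled for every pixel, so there is no way to write per-pixel, partial alpha values from a mask texture.

glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE); // write RGB, never touch the framebuffer's alpha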

glBlendFuncSeparate could help you if you had it, but again, it's not a good solution: you're wasting your (very limited) iPhone bandwidth by sending twice as much data as needed.

UPDATE:

Since you're using OpenFrameworks, and according to its source code (https://github.com/openframeworks/openFrameworks/blob/master/libs/openFrameworks/gl/ofTexture.cpp and https://github.com/openframeworks/openFrameworks/blob/master/libs/openFrameworks/video/ofVideoPlayer.cpp):

  • Use ofVideoPlayer::setUseTexture(false) so that ofVideoPlayer::update won't upload the data to video memory;
  • Get the video data with ofVideoPlayer::getPixels;
  • Interleave the result into an RGBA texture (you can use a GL_RGBA ofTexture and ofTexture::loadData), as sketched after this list;
  • Draw using ofTexture::draw (this is what ofVideoPlayer does anyway).
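Here is a minimal sketch of that interleaving step. It assumes the 007-era OpenFrameworks API, where ofVideoPlayer::getPixels() returns an unsigned char*; the member names video, mask, rgba and tex are hypothetical (video and mask opened with setUseTexture(false), tex allocated once as GL_RGBA, rgba a width * height * 4 byte buffer allocated once):

// Hypothetical members, declared in testApp.h:
// ofVideoPlayer video, mask;   // RGB movie and grayscale mask movie
// ofTexture     tex;           // tex.allocate(w, h, GL_RGBA);
// unsigned char * rgba;        // new unsigned char[w * h * 4];

void testApp::update(){
    video.update();
    mask.update();
    if(video.isFrameNew() || mask.isFrameNew()){
        unsigned char * rgb = video.getPixels(); // 3 bytes per pixel
        unsigned char * a   = mask.getPixels();  // 3 bytes per pixel, R == G == B
        int n = (int)video.getWidth() * (int)video.getHeight();
        for(int i = 0; i < n; i++){
            rgba[i * 4 + 0] = rgb[i * 3 + 0];
            rgba[i * 4 + 1] = rgb[i * 3 + 1];
            rgba[i * 4 + 2] = rgb[i * 3 + 2];
            rgba[i * 4 + 3] = a[i * 3];          // any channel of the mask will do
        }
        tex.loadData(rgba, (int)video.getWidth(), (int)video.getHeight(), GL_RGBA);
    }
}

void testApp::draw(){
    ofEnableAlphaBlending(); // standard GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blending
    tex.draw(0, 0);
}

This replaces the whole three-pass blending dance with a single textured quad.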
Calvin1602
  • The problem is that my textures are videos, and we chose the H.264 codec because the files are much lighter than Animation codec files... And this will be the only 'heavy' process, so I guess it should work, even on the iPad, don't you think? – user1554162 Jul 26 '12 at 10:17
  • You have to send the video to OpenGL with glTexSubImage2D somewhere, right? Or is the video decoded directly to video memory? – Calvin1602 Jul 26 '12 at 10:19
  • I'm using the OpenFrameworks class ofVideoPlayer: http://www.openframeworks.cc/documentation/video/ofVideoPlayer.html#draw. It loads a movie file via QuickTime, so I guess the video is decoded into video memory, am I wrong? – user1554162 Jul 26 '12 at 10:25
  • Yes, see the updated answer and https://github.com/openframeworks/openFrameworks/blob/master/libs/openFrameworks/video/ofVideoPlayer.cpp#L97 – Calvin1602 Jul 26 '12 at 11:38
  • Thanks a lot for your answer, but that's what I did and it's very slow on the iPad... I do the interleaving for each pixel. That's why I thought that doing it with OpenGL blend functions would run on the GPU and be more efficient... – user1554162 Jul 26 '12 at 12:38