
I am trying to render 3 textures:

- Background
- Black/white foreground mask
- Foreground

I have used the approach from OpenGL - mask with multiple textures because it accurately describes my problem, but I cannot get it to work: I only get the last rendered texture, in this case the foreground. I have called glutInitDisplayMode(GLUT_ALPHA); to get alpha rendering, as suggested in the answer there. Can anyone spot errors on my side?

My code is as follows:

double stretch = ((double)m_videoResY * (double)m_depthResX) / ((double)m_videoResX * (double)m_depthResY);

glEnable(GL_BLEND);
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glOrtho(+0.5, -0.5, +0.5, -0.5, 0.001f, 1.0);

glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
glTranslatef(0.0f, 0.0f, -0.5f);

glEnable(GL_TEXTURE_2D);
glActiveTexture(GL_TEXTURE0);
glDisable(GL_DEPTH_TEST);

glClearColor(0, 0, 0, 0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBlendFunc(GL_ONE, GL_ZERO);

glBindTexture(GL_TEXTURE_2D, m_backgroundTexture);//Draw BGTexture

glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST);

glBegin(GL_QUADS);
    glTexCoord2f(1.0f, 1.0f);
    glVertex3f(-0.5f, -0.5f, 0.0f);
    glTexCoord2f(1.0f, 0.0f);
    glVertex3f(-0.5f, 0.5f, 0.0f);
    glTexCoord2f(0.0f, 0.0f);
    glVertex3f(0.5f, 0.5f, 0.0f);
    glTexCoord2f(0.0f, 1.0f);
    glVertex3f(0.5f, -0.5f, 0.0f);
glEnd();


glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_SRC_COLOR, GL_ZERO);

//mask with userID
glBindTexture(GL_TEXTURE_2D, m_userIDTexture);

glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);


glBegin(GL_QUADS);
    glTexCoord2f(1.0f, 0.0f);
    glVertex3f(-0.5f, -0.5f, 0.0f);
    glTexCoord2f(1.0f, 1.0f * stretch);
    glVertex3f(-0.5f, 0.5f, 0.0f);
    glTexCoord2f(0.0f, 1.0f * stretch);
    glVertex3f(0.5f, 0.5f, 0.0f);
    glTexCoord2f(0.0f, 0.0f);
    glVertex3f(0.5f, -0.5f, 0.0f);
glEnd();



//blend with Video of User
glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);


glBindTexture(GL_TEXTURE_2D, m_videoTexture);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST);

glBegin(GL_QUADS);
    glTexCoord2f(1.0f, 0.0f);
    glVertex3f(-0.5f, -0.5f, 0.0f);
    glTexCoord2f(1.0f, 1.0f);
    glVertex3f(-0.5f, 0.5f, 0.0f);
    glTexCoord2f(0.0f, 1.0f);
    glVertex3f(0.5f, 0.5f, 0.0f);
    glTexCoord2f(0.0f, 0.0f);
    glVertex3f(0.5f, -0.5f, 0.0f);
glEnd();
Skofgar
    Can you double check that you really got alpha planes? Use `GLint val; glGetIntegerv(GL_ALPHA_BITS, &val);`. When you say that you called `glutInitDisplayMode(GLUT_ALPHA)`, I figure you added `GLUT_ALPHA` to your initial `glutInitDisplayMode` call? I don't think you can change the mode once the window is created. – Reto Koradi Apr 27 '14 at 16:02
  • @Skofgar: You are right to request destination alpha bit-planes, but you have to understand that is not common (some equally uncommon implementations do not even support it). It is almost never used in typical rendering, so even though your pixel format is usually 32-bit, you often wind up with something like RGBx (what would/could have been used for Alpha, `x`, becomes padding for alignment) unless you explicitly ask for a pixel format with > 0-bits of destination alpha. – Andon M. Coleman Apr 27 '14 at 16:28
  • `GLUT_ALPHA` is supposed to do that, but if you use a bad combination of other parameters you can wind up with a GDI pixel format (braindead software renderer on Windows, which does not support destination alpha). The only way to know for sure what is going on is to request some information about your pixel format ***after*** GLUT gives it to you, all platforms use a pattern-matching system for pixel format selection and will return whatever they think is closest to what you asked for. – Andon M. Coleman Apr 27 '14 at 16:30
  • @RetoKoradi I have 8 alpha bitplanes according to glGetIntegerv. Is that good or bad? – Skofgar Apr 28 '14 at 07:51
  • @AndonM.Coleman How do I request said pattern match to see my pixel format? – Skofgar Apr 28 '14 at 08:25
  • You actually don't do this. GLUT hides this from you. I was just explaining how things worked under the hood so you might know why `GLUT_ALPHA` by itself might not do what you want. If you request a 32-bit depth buffer, for instance, it often forces you onto a software pixel format. – Andon M. Coleman Apr 28 '14 at 12:11
  • @Skofgar: That's good. It means that you got 8 bits of alpha in your framebuffer. You probably want something like `glutInitDisplayMode(GLUT_RGB | GLUT_ALPHA | GLUT_DEPTH | GLUT_DOUBLE)`. Well, it's bad in the sense that it takes away the most obvious explanation why your code isn't working... – Reto Koradi Apr 28 '14 at 15:33
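
Putting the comments together, a minimal sketch of the suggested verification (the display-mode flags and the `GL_ALPHA_BITS` query are taken from the comments above; the rest of the window setup is an assumption):

#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);

    /* Request destination alpha explicitly, alongside the usual buffers,
       as suggested in the comments. */
    glutInitDisplayMode(GLUT_RGB | GLUT_ALPHA | GLUT_DEPTH | GLUT_DOUBLE);
    glutCreateWindow("alpha check");

    /* Pixel format selection is pattern-matched, so query what was actually
       delivered: 0 alpha bits means the masking technique cannot work. */
    GLint alphaBits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
    printf("GL_ALPHA_BITS = %d\n", alphaBits);

    return 0;
}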

1 Answer


I suppose your mistakes are the following:

You draw your background with glBlendFunc(GL_ONE, GL_ZERO);. As a result it is written to the framebuffer normally, but the blending operation does nothing useful. It would be more efficient to skip blending entirely on this pass with glDisable(GL_BLEND), although your pass does work here the way you expect.
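
A minimal sketch of that first pass with blending switched off (and re-enabled afterwards, because the later passes need it):

glDisable(GL_BLEND);                /* plain overwrite, no blending needed */
glBindTexture(GL_TEXTURE_2D, m_backgroundTexture);
/* ... draw the background quad exactly as in your code ... */
glEnable(GL_BLEND);                 /* the mask and foreground passes blend */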

In the second pass you draw with glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_SRC_COLOR, GL_ZERO);. I'm not sure why you chose such a sophisticated function here, blending the color and alpha values separately; but judging from the third pass, I suppose you want to modify the background's alpha value with your black/white foreground color mask. If that is true, you must use glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_ZERO, GL_SRC_COLOR); - yep, a little mistake.
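
With that change, the second pass would read roughly like this (a sketch; only the blend call differs from your code):

/* Keep the RGB already in the framebuffer (GL_ZERO, GL_ONE) and scale the
   stored alpha by the mask: A_new = A_src * A_dst. For the alpha channel,
   the GL_SRC_COLOR factor resolves to the source alpha value. */
glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_ZERO, GL_SRC_COLOR);
glBindTexture(GL_TEXTURE_2D, m_userIDTexture);
/* ... draw the mask quad exactly as in your code ... */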

And in the third pass, when you draw the foreground, you have glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);, which means you draw the foreground only in those regions where your black/white mask was white, attenuating it toward the background in the darker mask regions.
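
Putting it all together, the blend state for the three passes would look like this (only a sketch; your quads, texture binds and matrix setup stay exactly as they are):

/* pass 1: background - plain overwrite of the framebuffer */
glDisable(GL_BLEND);
/* ... draw background quad ... */
glEnable(GL_BLEND);

/* pass 2: mask - leave RGB untouched, write the mask into destination alpha */
glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_ZERO, GL_SRC_COLOR);
/* ... draw mask quad ... */

/* pass 3: foreground - RGB_new = A_dst * RGB_src + (1 - A_dst) * RGB_dst,
   i.e. foreground where the mask was white, background where it was black */
glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);
/* ... draw foreground quad ... */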

If you have any questions about glBlendFunc, I can help you.

xmash