
I'm in the midst of porting some code from OpenGL ES 1.x to OpenGL ES 2.0, and I'm struggling to get transparency working as it did before; all my triangles are being rendered fully opaque.

My OpenGL setup has these lines:

// Draw objects back to front
glDisable(GL_DEPTH_TEST);

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

glDepthMask(false);

And my shaders look like this:

attribute vec4 Position;
uniform highp mat4 mat;
attribute vec4 SourceColor;
varying vec4 DestinationColor;

void main(void) {
  DestinationColor = SourceColor;
  gl_Position = Position * mat;
}

and this:

varying lowp vec4 DestinationColor;

void main(void) {
  gl_FragColor = DestinationColor;
}

What could be going wrong?

EDIT: If I manually set the alpha to 0.5 in the fragment shader (or indeed in the vertex shader), as suggested by keaukraine below, then everything renders transparent. Furthermore, if I change the color values I'm passing in to OpenGL to be floats instead of unsigned bytes, then the code works correctly.

So it looks as though something is wrong with the code that was passing the color information into OpenGL, and I'd still like to know what the problem was.

My vertices were defined like this (unchanged from the OpenGL ES 1.x code):

typedef struct
{
  GLfloat x, y, z, rhw;
  GLubyte r, g, b, a;
} Vertex;

And I was using the following code to pass them into OpenGL (similar to the OpenGL ES 1.x code):

glBindBuffer(GL_ARRAY_BUFFER, glTriangleVertexBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex) * nTriangleVertices, triangleVertices, GL_STATIC_DRAW);
glUniformMatrix4fv(matLocation, 1, GL_FALSE, m);
glVertexAttribPointer(positionSlot, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, x));
glVertexAttribPointer(colorSlot, 4, GL_UNSIGNED_BYTE, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, r));
glDrawArrays(GL_TRIANGLES, 0, nTriangleVertices);
glBindBuffer(GL_ARRAY_BUFFER, 0);

What is wrong with the above?

Rich

3 Answers


Your colour vertex attribute values are not being normalized. This means that the vertex shader sees values for that attribute in the range 0-255 instead of 0.0-1.0.

Change the fourth argument of glVertexAttribPointer to GL_TRUE and the values will be normalized (scaled to the range 0.0-1.0) as you originally expected.
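Using the attribute setup from your question (positionSlot, colorSlot and Vertex are your names), the colour pointer would become something like:

// Positions stay plain floats; no normalization wanted there
glVertexAttribPointer(positionSlot, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, x));
// Colours are unsigned bytes, so ask GL to normalize them to 0.0-1.0
glVertexAttribPointer(colorSlot, 4, GL_UNSIGNED_BYTE, GL_TRUE, sizeof(Vertex), (void*)offsetof(Vertex, r));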

See http://www.khronos.org/opengles/sdk/docs/man/xhtml/glVertexAttribPointer.xml

GuyRT
  • Probably not as stupid as I did when I realized a bug in my code left that flag completely undefined! – GuyRT Jun 17 '13 at 09:03

I suspect the DestinationColor varying reaching your fragment shader always contains 0xFF for the alpha channel. If so, that is your problem. Try changing it so that the alpha actually varies.

Update: We found 2 good solutions:

  1. Use floats instead of unsigned bytes for the vertex color data that ends up in the DestinationColor varying in the fragment shader (see the sketch after this list).

  2. Or, as GuyRT pointed out, you can change the fourth argument of glVertexAttribPointer to GL_TRUE to tell OpenGL ES to normalize the values when they are converted from integers to floats.
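
For option 1, a rough sketch of what the float-based vertex layout could look like (the struct and slot names follow the question; treat it as illustrative rather than a drop-in):

typedef struct
{
  GLfloat x, y, z, rhw;
  GLfloat r, g, b, a;   /* colours as floats in the 0.0-1.0 range */
} VertexF;

/* upload the buffer as before, then: */
glVertexAttribPointer(colorSlot, 4, GL_FLOAT, GL_FALSE, sizeof(VertexF), (void*)offsetof(VertexF, r));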

ClayMontgomery
  • I think it always contains 1, actually, but you're correct that it's wrong once it gets to the fragment shader. (See edit, above.) Any ideas? – Rich Jun 13 '13 at 10:14
  • The input color values must be floats in the range 0.0 to 1.0. Because you were using bytes, all integer values above 0 were being translated to values 1.0 and above, which saturate to 1.0 to give you fully opaque. – ClayMontgomery Jun 13 '13 at 21:29
  • Thanks! I figured that was what was happening, but I didn't realise it was simply because you cannot use unsigned bytes for color in OpenGL ES 2.0. If you edit your answer to include this info I'll delete my answer and accept yours. – Rich Jun 14 '13 at 08:44
  • 1
    You can use bytes for colour. Just change the fourth argument of `glVertexAttribPointer` to `GL_TRUE` and the values will be normalized as you originally expected. – GuyRT Jun 14 '13 at 10:52

To debug this situation, you can try setting constant alpha and see if it makes a difference:

varying lowp vec4 DestinationColor;

void main(void) {
  gl_FragColor = DestinationColor;
  gl_FragColor.a = 0.5; /* try other values from 0 to 1 to test blending */
}

Also, you should ensure that you're picking an EGL config with an alpha channel.
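
For example, when choosing the config you can request destination alpha explicitly (a rough sketch assuming a typical EGL setup; display stands for your EGLDisplay):

const EGLint attribs[] = {
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
    EGL_RED_SIZE,   8,
    EGL_GREEN_SIZE, 8,
    EGL_BLUE_SIZE,  8,
    EGL_ALPHA_SIZE, 8,   /* request an alpha channel in the framebuffer */
    EGL_NONE
};
EGLConfig config;
EGLint numConfigs;
eglChooseConfig(display, attribs, &config, 1, &numConfigs);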

And don't forget to specify a default precision for floats in fragment shaders! See the OpenGL ES Shading Language specification (http://www.khronos.org/registry/gles/specs/2.0/GLSL_ES_Specification_1.0.17.pdf, section 4.5.3), and please see this answer: https://stackoverflow.com/a/6336285/405681

keaukraine
  • Thanks. That was helpful. It looks as though I'm doing something wrong with the way I'm passing in unsigned bytes. (See edit to question.) Any ideas? – Rich Jun 13 '13 at 10:12