
It seems that I can't use gl_FragDepth on my computer. My program otherwise works fine: no GLSL errors, glGetError returns 0, but I can't write to the depth buffer from my fragment shader.

On top of that, writing to gl_FragDepth changes the red component of the pixel color.

Here is a simplified version of my program. I pruned all the stuff I believe is irrelevant (I guess?), and it doesn't behave any better:

#include <stdio.h>
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int        main(void)
{
//  These are custom structures for handling shader programs.
    t_glprog                prog;
    t_glprog                prog2;

    GLuint                  vbo;
    GLFWwindow              *window;
    static const GLfloat    vertab[] =
    {
        -1.0, -1.0, 1.0,
        1.0, -1.0, 1.0,
        1.0, 1.0, 1.0,
        -1.0, 1.0, 1.0
    };

    char const *vert =
        "attribute vec3 Coord;"
        "void main() {\
        gl_Position = vec4(Coord, 1.0);\
        }";

    char const *frag1 =
        "void main () {\
        gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);\
        gl_FragDepth = sin(gl_FragCoord.x * 0.1);\
        }";

    char const *frag2 =
        "void main () {\
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\
        gl_FragDepth = cos(gl_FragCoord.x * 0.1);\
        }";

    if (!glfwInit())
    {
        fprintf(stderr, "GLFW failed to init.\n");
        return (-1);
    }

    glfwWindowHint(GLFW_DEPTH_BITS, 64);
    window = glfwCreateWindow(640, 480, "TEST", NULL, NULL);
    if (window == NULL)
    {
        fprintf( stderr, "Failed to open GLFW window.\n" );
        glfwTerminate();
        return (-1);
    }
    glfwMakeContextCurrent(window);

//  For Windows.
    if (glewInit() != GLEW_OK)
    {
        fprintf(stderr, "Failed to initialize GLEW\n");
        return (-1);
    }

    glfwSetInputMode(window, GLFW_STICKY_KEYS, GL_TRUE);

    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);
    glClearDepth(1.0);
    glViewport(0, 0, 640, 480);

    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertab), vertab, GL_STATIC_DRAW);

    create_shaders_prog(&prog, vert, frag1);
    create_shaders_prog(&prog2, vert, frag2);

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);

    glUseProgram(prog.prog);
    glDrawArrays(GL_QUADS, 0, 4);

    glUseProgram(prog2.prog);
    glDrawArrays(GL_QUADS, 0, 4);

    glFlush();
    glfwSwapBuffers(window);

    while (glfwGetKey(window, GLFW_KEY_ESCAPE) != GLFW_PRESS &&
            glfwWindowShouldClose(window) == 0)
    {
        glfwPollEvents();
    }
    return (0);
}

It's supposed to draw red and green vertical stripes: the two quads overlap exactly, so with GL_LESS whichever fragment writes the smaller of the sin/cos depth values should win, alternating along x. Instead I get blurred red lines, and if I remove the second draw call the result is the same. That's on Windows; on OS X it works as expected.
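For what it's worth, this is the kind of check I can add right after the two draw calls to see whether anything reaches the depth buffer at all (a rough sketch, not part of the pruned program above; it only uses the standard glReadPixels depth readback):

    {
        /* Quick check: read back one scanline of the depth buffer and print
           a few samples, to see whether the sin()/cos() pattern lands in it. */
        GLfloat depth_row[640];
        int     i;

        glReadPixels(0, 240, 640, 1, GL_DEPTH_COMPONENT, GL_FLOAT, depth_row);
        for (i = 0; i < 640; i += 64)
            printf("x = %3d  depth = %f\n", i, depth_row[i]);
    }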

Here are some specs from glGetString:

    GL_VENDOR : Intel
    GL_RENDERER : Mobile Intel(R) 4 Series Express Chipset Family
    GL_VERSION : 2.1.0 - Build 8.15.10.1892
    GL_SHADING_LANGUAGE_VERSION : 1.20 - Intel Build 8.15.10.1892
Procrade
  • Is it possible to use OpenGL 3.2 or later in your app, instead of OpenGL 2.1? – Matt Fichman Sep 21 '15 at 21:36
  • Don't you have to draw inside the event loop? – Reto Koradi Sep 22 '15 at 02:16
  • My PC does not support OpenGL 3.2, and I thought gl_FragDepth has been available since GLSL 1.10. And well, I don't need to loop, there is no animation. – Procrade Sep 22 '15 at 13:30
  • @Procrade What integrated graphics chipset do you have? I had an older Dell laptop that struggled with similar problems when using multiple render targets. It turned out that the driver was buggy, so I just couldn't use that feature. If you give us the chipset info, then we can search to see if your problem is a known issue. – Matt Fichman Sep 22 '15 at 13:56
  • @MattFichman It's an Intel GMA 4500M. – Procrade Sep 23 '15 at 12:59
  • @Procrade, have you tried upgrading your GPU driver recently? – Matt Fichman Sep 25 '15 at 15:39

1 Answer


Is it possible that your integrated graphics card driver is choking on this line?

glfwWindowHint(GLFW_DEPTH_BITS, 64);

64 bits is an awful lot for a depth buffer. 24 bits is a more typical value. Context creation should fail if a 64-bit depth buffer isn't supported, but I've seen strange behavior from some OpenGL drivers if the depth buffer isn't set up properly.
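If you want to rule that out, request a conventional depth size and then ask the context what it actually gave you, roughly like this (an untested sketch against your code; GL_DEPTH_BITS is the legacy query, still available in a 2.1 context):

    glfwWindowHint(GLFW_DEPTH_BITS, 24);    /* a typical, widely supported depth size */
    window = glfwCreateWindow(640, 480, "TEST", NULL, NULL);
    /* ... existing error check and glfwMakeContextCurrent(window) ... */

    {
        GLint depth_bits = 0;
        glGetIntegerv(GL_DEPTH_BITS, &depth_bits);   /* what the driver actually provided */
        fprintf(stderr, "depth buffer bits: %d\n", depth_bits);
    }

If that prints 0 or some unexpected value, the context never got a usable depth buffer, and writes to gl_FragDepth would have nowhere to go.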

Matt Fichman
  • As I read the GLFW docs, that's a hint, not a hard constraint. It should just get as many depth bits as possible. – JWWalker Sep 21 '15 at 21:16
  • Yes, true. However, that assumes that the driver returns an error to GLFW if you attempt to use a bad depth buffer setting. Unfortunately, drivers (especially Windows OpenGL drivers for integrated graphics cards) aren't always very good at following the spec. Anyway, I figured it might be worth a shot, so I posted the answer :) – Matt Fichman Sep 21 '15 at 21:34
  • @JWWalker I agree with Matt: most cards correctly support only up to a 24-bit depth buffer, some better ones up to 32 (but usually only if the other buffers have a smaller bit-width or resolution, or both). I never saw better than that. Some bad gfx drivers will let you use a bigger depth buffer even if the HW does not support it (the last time I saw this was on an integrated SiS gfx card); see [What is the proper OpenGL initialisation on Intel HD 3000?](http://stackoverflow.com/q/19099162/2521214) for how I init the pixel format ... – Spektre Sep 22 '15 at 07:03
  • Tested it with glfwWindowHint(GLFW_DEPTH_BITS, 24), and with the call removed: nothing changed :( – Procrade Sep 22 '15 at 13:28