
I've been trying all day to get OpenGL to work in Java (with LWJGL 3/GLFW). The problem is that besides the clear color, nothing is drawn.

Question:

  1. Is the order of my OpenGL function calls okay?
  2. Are there any mistakes that prevent the vertices from being drawn?

This is the order of OpenGL calls, excluding the initialization. The complete program can be found here: http://pastebin.com/JWJ7pqBs. It is derived from the LWJGL 3 "Getting Started" example, so the initialization part is skipped here.

EDIT: Corrected some mistakes pointed out by derhass, but still no visible output: http://pastebin.com/FbkuusM3

EDIT 2: Tried a fixed color in the fragment shader, but still no vertices drawn.

Variables:

int shader;
int vao;
int vbo;

Create Shader (once):

    String dummyVertexShaderSrc =
                "#version 330 core"
                + "\n" + "layout(location = 0) in vec3 vs_position;"
                + "\n" + "layout(location = 1) in vec4 vs_color;"
                + "\n" + ""
                + "\n" + "out vec4 fs_color;"
                + "\n" + ""
                + "\n" + "void main() {"
                + "\n" + "    gl_Position = vec4(vs_position, 1.0);"
                + "\n" + "    fs_color = vs_color;"
                + "\n" + "}"
                ;

        String dummyFragmentShaderSrc = "#version 330 core"
                + "\n" + "in vec4 fs_color;"
                + "\n" + "void main() {"
                + "\n" + "    gl_FragColor = fs_color;"
                + "\n" + "}";

        System.out.println("Vertex-Shader: \n" + dummyVertexShaderSrc + "\n");
        System.out.println("Fragment-Shader: \n" + dummyFragmentShaderSrc + "\n");

        // 1# Read/Compile VertexShader
        int idVertexShader = GL20.glCreateShader(GL20.GL_VERTEX_SHADER);
        GL20.glShaderSource(idVertexShader, dummyVertexShaderSrc);
        GL20.glCompileShader(idVertexShader);

        if (GL20.glGetShaderi(idVertexShader, GL20.GL_COMPILE_STATUS) == GL11.GL_FALSE) {
            System.err.println("Could not compile vertex shader: " + GL20.glGetShaderInfoLog(idVertexShader));
            System.exit(-1);
        }

        // 2# Read/Compile FragmentShader
        int idFragmentShader = GL20.glCreateShader(GL20.GL_FRAGMENT_SHADER);
        GL20.glShaderSource(idFragmentShader, dummyFragmentShaderSrc);
        GL20.glCompileShader(idFragmentShader);

        if (GL20.glGetShaderi(idFragmentShader, GL20.GL_COMPILE_STATUS) == GL11.GL_FALSE) {
            System.err.println("Could not compile fragment shader: " + GL20.glGetShaderInfoLog(idFragmentShader));
            System.exit(-1);
        }

        // 3# Create Shader-Program
        shader = GL20.glCreateProgram();
        GL20.glAttachShader(shader, idVertexShader);
        GL20.glAttachShader(shader, idFragmentShader);

        GL20.glBindAttribLocation(shader, 0, "vs_position");
        GL20.glBindAttribLocation(shader, 1, "vs_color");

        GL20.glLinkProgram(shader);
        if (GL20.glGetProgrami(shader, GL20.GL_LINK_STATUS) == GL11.GL_FALSE) {
            System.out.println("Shader linking failed: " + GL20.glGetProgramInfoLog(shader));
            System.exit(-1);
        }

        GL20.glValidateProgram(shader);
        GL20.glDeleteShader(idVertexShader);
        GL20.glDeleteShader(idFragmentShader);

Create VAO/VBO (once):

        vao = GL30.glGenVertexArrays();
        GL30.glBindVertexArray(vao);

            vbo = GL15.glGenBuffers();
            GL15.glBindBuffer( GL15.GL_ARRAY_BUFFER, vbo);
            GL15.glBufferData( GL15.GL_ARRAY_BUFFER, 1024 * Vertex.ByteSize, null, GL15.GL_STREAM_DRAW );

            GL20.glEnableVertexAttribArray(0);
            GL20.glVertexAttribPointer(0, 3, GL11.GL_FLOAT, false, 0, 0);

            GL20.glEnableVertexAttribArray(1);
            GL20.glVertexAttribPointer(1, 4, GL11.GL_FLOAT, false, 0, 3);

            GL20.glDisableVertexAttribArray(0);
            GL20.glDisableVertexAttribArray(1);

            GL15.glBindBuffer( GL15.GL_ARRAY_BUFFER, 0);

        GL30.glBindVertexArray(0);  

Fill the VAO/VBO with data (once):

    Vertex[] vertices = new Vertex[] {
                new Vertex(new Vector3f(-1.0f, -1.0f, 0.0f), new Color4f(0.5f, 0.5f, 0.5f, 0.5f), Vector2f.Zero),
                new Vertex(new Vector3f( 1.0f, -1.0f, 0.0f), new Color4f(0.5f, 0.5f, 0.5f, 0.5f), Vector2f.Zero),
                new Vertex(new Vector3f( 0.0f,  1.0f, 0.0f), new Color4f(0.5f, 0.5f, 0.5f, 0.5f), Vector2f.Zero),
        };

        // 1# Create buffer
        FloatBuffer buffer = ByteBuffer.allocateDirect(3 * 7 * Float.BYTES).asFloatBuffer();
        buffer.position(0);
        for(Vertex vertex : vertices) {
            buffer.put(vertex.position.x);
            buffer.put(vertex.position.y);
            buffer.put(vertex.position.z);
            buffer.put(vertex.color.r);
            buffer.put(vertex.color.g);
            buffer.put(vertex.color.b);
            buffer.put(vertex.color.a);
        }

        // 2# Write data
        GL30.glBindVertexArray(vao);

            GL15.glBindBuffer( GL15.GL_ARRAY_BUFFER, vbo);
            GL15.glBufferSubData( GL15.GL_ARRAY_BUFFER, 3 * 7, buffer);
            GL15.glBindBuffer( GL15.GL_ARRAY_BUFFER, 0);

        GL30.glBindVertexArray(0);

And finally in the main loop draw the vertices (multiple times):

        GL20.glUseProgram(shader);
        GL30.glBindVertexArray( vao );
        GL20.glEnableVertexAttribArray(0);
        GL20.glEnableVertexAttribArray(1);

        GL11.glDrawArrays( GL11.GL_TRIANGLES, 0, 3 );

        GL20.glDisableVertexAttribArray(1);
        GL20.glDisableVertexAttribArray(0);
        GL30.glBindVertexArray( 0 );
        GL20.glUseProgram(0);
Basti Funck
  • The shader compilation is successful? `gl_FragColor` is not a predefined variable in the core profile. So the compilation of the fragment shader should fail. – Reto Koradi Apr 26 '15 at 18:30
  • @RetoKoradi: I just looked it up: actually, it is valid GL: it is _deprecated_ in 3.30 core, but not removed yet. – derhass Apr 26 '15 at 18:33
  • Shader compilation is successful, what is the replacement for setting a vertex's color? – Basti Funck Apr 26 '15 at 18:43
  • @BastiFunck: the replacement for writing a _fragment's_ color(s) is declaring your own `out` variables in the fragment shader (and possibly binding them to the correct color numbers if you use multiple render targets). – derhass Apr 26 '15 at 18:45
  • try simplifying the fragment shader to output a constant color in your fragment shader, e.g. out_color = vec4(1.0, 0.0, 0.0, 1.0); if this works it means you are passing your vertex attributes (color/pos) incorrectly –  Apr 26 '15 at 19:56
  • Tried that, and also played around a bit with the position vector in the vertex shader. I even shuffled the vertex positions in the buffer to hopefully create at least a malformed polygon. Still no success. – Basti Funck Apr 26 '15 at 20:13
  • @derhass Yes, looks like it's still valid if it's just a plain core profile. Not if you're using a forward compatible context, though. Most people probably use a forward compatible context when they use the core profile. – Reto Koradi Apr 26 '15 at 20:14
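Following derhass's comment, a core-profile fragment shader would declare its own `out` variable instead of writing to the deprecated `gl_FragColor`. A minimal sketch, built as a Java string like the shaders in the question (`out_color` is an assumed name, not from the original code):

```java
public class CoreFragmentShader {
    // Sketch of a core-profile fragment shader: a user-declared "out"
    // variable (bound to color number 0 by default) replaces gl_FragColor.
    static final String FRAGMENT_SHADER_SRC =
              "#version 330 core\n"
            + "in vec4 fs_color;\n"
            + "out vec4 out_color;\n"
            + "void main() {\n"
            + "    out_color = fs_color;\n"
            + "}\n";

    public static void main(String[] args) {
        System.out.println(FRAGMENT_SHADER_SRC);
    }
}
```

With a single render target no explicit `glBindFragDataLocation` call is needed; the one declared output gets color number 0.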

2 Answers


Going through your code, I spotted a couple of mistakes:

GL20.glVertexAttribPointer(1, 4, GL11.GL_FLOAT, false, 0, 3);

You are probably intending interleaved attributes; however, the offset of 3 bytes is certainly wrong here. You seem to want to skip the 3 floats of attribute 0, so this should most likely be 3 * sizeof(GLfloat).

Furthermore, your stride parameter is also wrong: you are telling the GL that each attribute is tightly packed on its own. You probably want 7 * sizeof(GLfloat) as the stride.
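Java has no sizeof, but `Float.BYTES` gives the same arithmetic. Assuming the layout from the question (a 3-float position followed by a 4-float color per vertex), the stride and offset would work out as follows; the corrected `glVertexAttribPointer` calls are shown as comments since they need a live GL context:

```java
public class InterleavedLayout {
    public static void main(String[] args) {
        final int positionFloats = 3; // vec3 vs_position
        final int colorFloats = 4;    // vec4 vs_color

        // Stride: total bytes per vertex, covering both interleaved attributes.
        int stride = (positionFloats + colorFloats) * Float.BYTES;
        // Offset of the color attribute: the position bytes that precede it.
        int colorOffset = positionFloats * Float.BYTES;

        System.out.println(stride);      // bytes per vertex
        System.out.println(colorOffset); // byte offset of attribute 1

        // Sketch of the corrected calls (not executed here):
        // GL20.glVertexAttribPointer(0, 3, GL11.GL_FLOAT, false, stride, 0);
        // GL20.glVertexAttribPointer(1, 4, GL11.GL_FLOAT, false, stride, colorOffset);
    }
}
```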

The following is unneeded:

GL20.glDisableVertexAttribArray(0);
GL20.glDisableVertexAttribArray(1);

The VAO keeps track of the enable bits as well as the pointer values. You don't need to disable them here in this VAO, or re-enable and disable them at draw time. Just keep them enabled, and for drawing, bind the VAO.

When uploading your data:

GL15.glBufferSubData( GL15.GL_ARRAY_BUFFER, 3 * 7, buffer);

you forgot that these sizes and offsets are in bytes, so you need to multiply by the float size again. However, I don't fully understand what you are doing here. glBufferSubData() wants an offset and a size. The size parameter is probably removed from the Java API since it is implicitly known from the buffer object, so why are you using an offset different from 0 here?
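To illustrate: the LWJGL overload takes its size from the NIO buffer's position/limit, so the upload should pass offset 0 and a prepared buffer. A sketch with the question's 3 vertices of 7 floats (the GL call stays commented out since it needs a context):

```java
import java.nio.ByteBuffer;
import java.nio.FloatBuffer;

public class BufferUpload {
    public static void main(String[] args) {
        // 3 vertices * 7 floats each, as in the question.
        FloatBuffer buffer = ByteBuffer.allocateDirect(3 * 7 * Float.BYTES)
                .asFloatBuffer();
        for (int i = 0; i < 3 * 7; i++) {
            buffer.put(0.5f); // dummy data
        }
        // flip() so LWJGL sees the filled region (position..limit).
        buffer.flip();
        System.out.println(buffer.remaining()); // floats that would be uploaded

        // Corrected upload: byte offset 0, size implied by the buffer.
        // GL15.glBufferSubData(GL15.GL_ARRAY_BUFFER, 0, buffer);
    }
}
```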

derhass
  • Corrected my offsets (I misunderstood the API), removed the unnecessary `Enable/DisableVertexAttribArray` calls & fixed the `glBufferSubData` mistake (that offset wasn't intended). The corrected version of the code is in my question. But sadly still no visible output. – Basti Funck Apr 26 '15 at 19:09
  • 2
    @BastiFunck You're still disabling the vertex attributes. Also, generally you shouldn't edit your question to fix problems that were pointed out in answers. This invalidates the answer, and the goal of SO is to build a repository of questions and matching answers. – Reto Koradi Apr 26 '15 at 20:19

The problem is that ByteBuffer.allocateDirect allocates the buffer in big-endian format, no matter which machine the code runs on. ("All x86 and x86-64 machines [..] are little-endian")

The solution is either to set the byte order manually or simply to use LWJGL's BufferUtils class, which creates the ByteBuffer in the byte order of the current machine.
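This is easy to demonstrate with plain NIO: a freshly allocated direct buffer reports big-endian order, and `order(ByteOrder.nativeOrder())` fixes it before creating the float view (LWJGL's `BufferUtils.createFloatBuffer` does the equivalent, to the best of my knowledge):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndiannessCheck {
    public static void main(String[] args) {
        // A direct ByteBuffer always starts out big-endian, regardless of the host.
        System.out.println(ByteBuffer.allocateDirect(4).order());

        // Setting the native order before taking the FloatBuffer view fixes this.
        ByteBuffer bb = ByteBuffer.allocateDirect(4).order(ByteOrder.nativeOrder());
        System.out.println(bb.order().equals(ByteOrder.nativeOrder()));
    }
}
```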

Basti Funck