
I am just getting started with OpenGL, and have already hit a pretty frustrating bug. I've followed the learnopengl tutorial, encapsulating most of the code into a renderer class, which holds uints for the buffers and such. Here is the main code that does everything:

#include <gfx/gfx.h>
#include <gfx/gl.h>

#include <gfx/shaders.h>

#include <iostream>

void Renderer::init() {

    vertex_shader_id = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertex_shader_id, 1, &vertex_shader, nullptr);
    glCompileShader(vertex_shader_id);

    GLint vertex_shader_status;
    glGetShaderiv(vertex_shader_id, GL_COMPILE_STATUS, &vertex_shader_status);
    
    if (vertex_shader_status == false) {
        std::cout << "vsh compilation failed due to";
        char vertex_fail_info_log[1024];
        glGetShaderInfoLog(vertex_shader_id, 1024, nullptr, vertex_fail_info_log);
        std::cout << vertex_fail_info_log << std::endl;
        abort();
    }
    
    fragment_shader_id = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragment_shader_id, 1, &fragment_shader, nullptr);
    glCompileShader(fragment_shader_id);

    GLint fragment_shader_status;
    glGetShaderiv(fragment_shader_id, GL_COMPILE_STATUS, &fragment_shader_status);
    
    if (fragment_shader_status == false) {
        std::cout << "fsh compilation failed due to";
        char fragment_fail_info_log[1024];
        glGetShaderInfoLog(fragment_shader_id, 1024, nullptr, fragment_fail_info_log);
        std::cout << fragment_fail_info_log << std::endl;
        abort();
    }
    
    shader_program = glCreateProgram();
    glAttachShader(shader_program, vertex_shader_id);
    glAttachShader(shader_program, fragment_shader_id);
    glLinkProgram(shader_program);

    GLint shader_program_status;
    glGetProgramiv(shader_program, GL_LINK_STATUS, &shader_program_status);
    
    if (shader_program_status == false) {
        std::cout << "shprogram compilation failed due to";
        char shader_program_fail_info_log[1024];
        glGetShaderInfoLog(shader_program, 1024, nullptr, shader_program_fail_info_log);
        std::cout << shader_program_fail_info_log << std::endl;
        abort();
    }
    
    glUseProgram(shader_program);

    glDeleteShader(vertex_shader_id);
    glDeleteShader(fragment_shader_id);

    
}
 
void Renderer::draw(f32 verts[]) {
    
    glUseProgram(shader_program);
    
    glClearColor(1, 0, 0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);

    glCreateVertexArrays(1, &vertex_array);
    glBindVertexArray(vertex_array);
    
    glCreateBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(f32), (void*)0);
    glEnableVertexAttribArray(0);

    glBindVertexArray(vertex_array);
    glUseProgram(shader_program);
    glDrawArrays(GL_TRIANGLES, 0, 3);
}

Here is shaders.h:

#ifndef SHADERS_H
#define SHADERS_H

const char* vertex_shader =
"#version 460 core\n"
"layout (location = 0) in vec3 aPos;\n"

"void main() {\n"
"gl_Position = vec4(aPos.x, aPos.y, aPos.z, 1.0);\n"
"}\n\0";


const char* fragment_shader =
"#version 460 core\n"
"out vec4 FragColor;\n"

"void main() {\n"
"FragColor = vec4(0.0f, 1.0f, 0.0f, 1.0f);\n"
"}\n\0";

#endif

I cannot, for the life of me, figure out what is wrong. The red clear color shows up, but nothing else does.

  • I am very sure that this code is not from the tutorial. [`sizeof`](https://en.cppreference.com/w/cpp/language/sizeof) does not do what you expect. See [How do I determine the size of my array in C?](https://stackoverflow.com/questions/37538/how-do-i-determine-the-size-of-my-array-in-c) – Rabbid76 Jul 30 '22 at 06:33
  • Yes, I have changed the code a bit from the tutorial. Also, @David Sullivan pointed out this mistake (but that also doesn't fix the problem). – Moosa Jul 30 '22 at 07:57
  • 1
    Please read [How to create a Minimal, Reproducible Example](https://stackoverflow.com/help/minimal-reproducible-example). – Rabbid76 Jul 30 '22 at 07:59
  • Um, this is literally all there is to the thing. If I remove any other OpenGL calls, something will stop working. You cannot remove any more from this, imho, without leaving everyone in confusion. – Moosa Jul 30 '22 at 08:04
  • This is literally all the code there is. No other opengl calls anywhere else. Also, clearing works - which means context is set up correctly – Moosa Jul 30 '22 at 08:08
  • After glfwMakeContextCurrent. – Moosa Jul 30 '22 at 08:48
  • 1
    Do not create VAO (`vertex_array`) and VBO (`vbo`) in every frame, otherwise you will run out of memory. Create the VAO and the VBO in `Renderer::init`. It is sufficient to bind the VAO (`glBindVertexArray(vertex_array)`) before the draw call. – Rabbid76 Jul 30 '22 at 09:02
  • @Rabbid76, it's actually a line copied from the "Vertex Array Object" section of [this][1] tutorial. The difference is in that code, vertices is an array declared in the same scope as that function call, so calling sizeof(vertices) does return the number of bytes in vertices. [1]: https://learnopengl.com/Getting-started/Hello-Triangle – David Sullivan Jul 30 '22 at 18:23
  • @DavidSullivan ? See the very first comment. Anyway, your fix doesn't solve the problem, so there's another bug somewhere. – Rabbid76 Jul 30 '22 at 18:25
  • @Rabbid76 that's what I'm responding to. You say that code is not from the tutorial, but the line related to `sizeof` is copy and pasted from there. Maybe you mean their code in general is not from the tutorial. Either way I think it's worth clarifying why the line works in the tutorial but not in their code, with more details than "your code is different" (no shade). – David Sullivan Jul 30 '22 at 18:32
  • @DavidSullivan I am not referring to this one line. The code cannot be copied completely from the tutorial because it is wrong. The contributor said *"yes, I changed the code a little"*. So what are you accusing me of? The contributor used `sizeof` incorrectly for whatever reason. That's it. I am not a beginner. – Rabbid76 Jul 30 '22 at 18:36
  • @DavidSullivan Apart from that the wrong use of `sizeof` has nothing to do with OpenGL. If this is the only error the question should not be answered but closed as "duplicate". – Rabbid76 Jul 30 '22 at 18:44
  • @Rabbid76 I'm not trying to accuse you of anything, let alone being a beginner. I have gone through a previous version of this tutorial and made this exact mistake when trying to factor out this line (though into an init function, not a draw function). That's all, sorry if it was out of line. – David Sullivan Jul 30 '22 at 18:55

1 Answer


Looks like there's a problem with your draw method. The signature is `void Renderer::draw(f32 verts[])`, and later on you call `glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);`. The thing is, when you pass an array to a function, it decays to a pointer (even though the declaration can make it look like that's not happening, which is very confusing). So as a function parameter, `draw(f32 verts[])` is equivalent to `draw(f32* verts)`. This question has some explanations of what's happening there.

Anyway, when you call `sizeof(verts)`, you're just getting the size in bytes of a float pointer, not the number of bytes owned by `verts`. So you won't be specifying enough bytes when you call `glBufferData()` to create the triangle you're going for. The simple fix is to pass a length into your draw function, and then you would have something like

    void Renderer::draw(f32* verts, int length) {
        //...
        glBufferData(GL_ARRAY_BUFFER, sizeof(float) * length, verts, GL_STATIC_DRAW);
        //...
    }

Here are the docs on this particular function. It's possible there are other errors, but since you aren't black-screening, and the code generally looks right, it's unlikely there's a bunch of invalid operations or anything.

To continue debugging after this, add the following to your code

    #define GL_ERROR_CHECK() (log_error(__FILE__, __LINE__))

    void log_error(const char* file, int line) {
        GLenum err;
        while ((err = glGetError()) != GL_NO_ERROR) {
            std::cout << "GL error " << err << " in " << file << " at line " << line << std::endl;
        }
    }

and sprinkle GL_ERROR_CHECK() all over the place to see if any of the OpenGL calls were invalid.

  • What you said is absolutely right - I forgot about this. I changed my code to `glBufferData(GL_ARRAY_BUFFER, sizeof(f32) * 9, verts, GL_STATIC_DRAW);`, but it still doesn't display anything. Please help. – Moosa Jul 30 '22 at 07:57
  • Can you add the vertex array to your question? Also, I updated my answer with instructions on how to continue debugging. – David Sullivan Jul 30 '22 at 16:23