I need help understanding why I have to make a specific change to get my OpenGL project working on OS X (2019 MacBook), when without that change it works perfectly on Windows and Linux, on both ATI and NVIDIA hardware.
At some point I'm rendering to a framebuffer that is 1024 pixels wide and 1 pixel high. I need a straightforward orthographic projection, so for my projection matrix I use:
glm::ortho(0.f, (float)LookupMapSize, 1.f, 0.f)
With this projection matrix, I render my line geometry on Windows and Linux and it works as expected: all pixels are written to.
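For context, the render target and projection setup looks roughly like this (a simplified sketch, not my exact code; the RGBA8 format and the variable names are just for illustration):

```cpp
// Simplified sketch of the 1024x1 render target (placeholder names/format).
GLuint fbo = 0, colorTex = 0;

glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, LookupMapSize, 1, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

glViewport(0, 0, LookupMapSize, 1);
glm::mat4 projection = glm::ortho(0.f, (float)LookupMapSize, 1.f, 0.f);
// ... upload `projection` to the shader and draw the line geometry
```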
On OS X however, I initially saw nothing ending up in my framebuffer, just the color I cleared it to with glClearColor and glClear. Suspecting a shader issue, I set the fragment output to vec4(1), expecting an all-white result, but I still saw nothing except the clear color in my framebuffer. Depth testing, blending, culling and stencils were not the issue, so it had to be that my matrices were wrong. After much fiddling, I finally figured out that all I had to do was change my projection matrix to this:
glm::ortho(0.f, (float)LookupMapSize, 0.f, 1.f)
But why? Where does this difference come from? On Windows/Linux, bottom is at 1.f and top is at 0.f, while on OS X it's exactly the other way around. If I use the "OS X" matrix on Windows/Linux, I get the exact same bug I initially had on OS X.
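As far as I can tell, the only difference between the two matrices is a flip of the y axis. A small sketch (assuming LookupMapSize is 1024; the element values follow from GLM's documented ortho layout):

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

const float LookupMapSize = 1024.f;  // width of the lookup framebuffer

// Windows/Linux version: bottom = 1, top = 0
glm::mat4 winLinux = glm::ortho(0.f, LookupMapSize, 1.f, 0.f);
// winLinux[1][1] == -2.f (y scale),  winLinux[3][1] ==  1.f (y translation)

// OS X version: bottom = 0, top = 1
glm::mat4 osx = glm::ortho(0.f, LookupMapSize, 0.f, 1.f);
// osx[1][1]      ==  2.f (y scale),  osx[3][1]      == -1.f (y translation)

// Both map y in [0, 1] onto clip space [-1, 1]; the only difference is the
// orientation, i.e. swapping bottom/top vertically flips the render target.
```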
Rather than just keeping this platform-specific change in my code, I would like to understand what's going on.
edit: I check all my OpenGL calls automatically (glGetError); nothing returns an error anywhere. Unfortunately the OpenGL debug functions (glDebugMessageCallback) are not available on OS X...
edit: I verified that the results of glm::ortho are identical on OS X and on Linux/Windows, so my input into OpenGL is the same on all platforms.
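For reference, one minimal way to do such a per-platform comparison is to dump the matrix (using GLM's gtx/string_cast extension; not necessarily exactly what I did):

```cpp
#define GLM_ENABLE_EXPERIMENTAL   // needed for the gtx extensions in recent GLM
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtx/string_cast.hpp>
#include <cstdio>

int main()
{
    glm::mat4 proj = glm::ortho(0.f, 1024.f, 1.f, 0.f);
    // Dump the projection matrix so the output can be diffed across platforms.
    std::printf("%s\n", glm::to_string(proj).c_str());
}
```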