
I was trying to draw pixel-perfect bitmaps using quads in OpenGL... and to my surprise, a very important row of pixels was missing (green is OK, red is bad):

The quads were 30x30 px, with the red one 2x2 px in the top-left corner.

[screenshot of the rendered quads, annotated green (OK) and red (bad)]

  • Can someone explain what is happening?
  • How does OpenGL decide where to put pixels?
  • Which correction should I choose?

The texture uses GL_NEAREST filtering, and the texture environment is set to GL_REPLACE:

    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
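(For reference, GL_NEAREST itself is selected through the sampler state; the question only shows the env-mode call, but a standard setup would be:)

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);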

I don't use glViewport or glOrtho. Instead, I send the screen dimensions to the shaders through uniforms:

uniform float cw;   // client-area width in pixels
uniform float ch;   // client-area height in pixels
...
// Map pixel coordinates (origin top-left, y down) to NDC (origin centre, y up).
gl_Position = vec4(-1.0f + 2.0f*(in_pos.x + in_offset.x)/cw,
                    1.0f - 2.0f*(in_pos.y + in_offset.y)/ch,
                    0.0f, 1.0f);

But that doesn't seem to be the source of the problem, because when I add (0.5, 0.5) in the shader (equivalent to the third case), the problem goes away...

Ok, now I realize that the third option doesn't work either:

[screenshot showing the third correction failing as well]

thehorseisbrown
  • The line would only go missing if you scale down the sprite. Why are you surprised? – Vlad Nov 17 '19 at 18:09
  • What do you mean? I'm only scaling vertex coordinates inside the shader. Texture coordinates are always (0,0),(1,1)... And the bitmap data is constant for all quads and correct. – thehorseisbrown Nov 17 '19 at 18:15
  • There must be some downscaling. Shifting by half a pixel, or making it larger by half a pixel, would never make lines disappear. If you change the filtering to linear, you should see that line, which would confirm there is downscaling. – Vlad Nov 17 '19 at 18:21
  • How sure are you of that statement? Can you prove that GL_NEAREST can't produce a missing row when it's 1:1? I don't do any scaling. – thehorseisbrown Nov 17 '19 at 18:24
  • Yes, I'm pretty sure about it, though I likely won't produce a formal proof :) As a wild guess, check that you've passed the correct screen size. If you're on Windows, make sure you've used the client rectangle rather than the entire window size. – Vlad Nov 17 '19 at 18:53
  • Another piece of info needed: your sprite size is 30x30 px. Do you use texture rectangles, or are your textures POT, 32x32? That would be a source of downscaling too. If you're using POT textures for non-POT sprites, your texture coordinates need to be recalculated so that no scaling happens (see the sketch after these comments). – Vlad Nov 17 '19 at 18:57
  • You guessed it right: GetWindowRect -> GetClientRect solved it, and it was a scaling issue. *facepalm* Thank you! What does POT mean? I really don't scale textures at all; I just write pixels with D2D. If you make it into an answer, I'll accept it :). – thehorseisbrown Nov 17 '19 at 19:02
  • "pot" means "(having a size equal to a) **p**ower **o**f **t**wo". – HolyBlackCat Nov 17 '19 at 19:16

1 Answer


Check that you've passed the correct screen size into the shader.

When you create a window, you define its outer size, but the client area, where the rendering context lives, can be smaller when the window has a title bar or other decorations.
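To see why an oversized value makes rows vanish, a quick back-of-the-envelope (the numbers are hypothetical):

    // Suppose the client area is 800 px wide, but the outer window width of
    // 816 px was passed to the cw uniform. The shader normalizes by 816,
    // while the viewport maps NDC back onto 800 actual pixels:
    float quad_px   = 30.0f;
    float on_screen = quad_px * 800.0f / 816.0f;   // ~29.41 px
    // At slightly under 1:1 with GL_NEAREST, some source rows map to no
    // destination pixel, so a 1 px feature can disappear entirely.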

So you should either query the client-area size with GetClientRect(), or find the "correct" outer window size by calling AdjustWindowRect() before creating the window; that returns the window size whose client area matches the size you originally wanted.
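A minimal Win32 sketch of both options (hwnd, the window style, and the 800x600 client size are illustrative):

    // Option 1: query the actual client-area size and pass it to cw/ch.
    RECT rc;
    GetClientRect(hwnd, &rc);
    int cw = rc.right - rc.left;    // client width in pixels
    int ch = rc.bottom - rc.top;    // client height in pixels

    // Option 2: compute the outer size that yields the desired client area,
    // and pass it to CreateWindow instead of the raw 800x600.
    RECT want = { 0, 0, 800, 600 };
    AdjustWindowRect(&want, WS_OVERLAPPEDWINDOW, FALSE);
    int winW = want.right - want.left;
    int winH = want.bottom - want.top;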

Vlad