
I'm using glTexSubImage2D with GL_LUMINANCE and GL_UNSIGNED_BYTE to display raw greyscale data from a camera directly, rather than having to repack it into an RGB bitmap manually.
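Roughly what the upload looks like at the moment (a simplified sketch; `tex`, `width`, `height` and `frame` are placeholders for my real camera/display code):

```c
/* 8-bit mode: one byte per pixel, straight from the camera buffer
 * into an existing GL_LUMINANCE texture. */
glBindTexture(GL_TEXTURE_2D, tex);
glTexSubImage2D(GL_TEXTURE_2D, 0,          /* target, mip level */
                0, 0, width, height,       /* region to update */
                GL_LUMINANCE, GL_UNSIGNED_BYTE, frame);
```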

I would like to run the camera in a higher resolution mode, with 12 or 14 bits/pixel.

I can do this simply by setting GL_SHORT, but the camera returns data in big-endian order and my OpenGL implementation seems to be interpreting the bytes the wrong way around (on x86).

Is there a simple way of telling OpenGL that the textures are the 'wrong' way round? I would like to avoid manually byte-swapping the data just for display, because all the other functions expect big-endian data.

Martin Beckett

1 Answer


Check out the glPixelStore* group of functions.

You might need to play with GL_UNPACK_SWAP_BYTES or GL_UNPACK_LSB_FIRST, but double-check that you're using the correct GL_UNPACK_ALIGNMENT. By default the unpack alignment is 4, but if you're using one byte per pixel (luminance / unsigned byte), you'll want to set it to 1. I ran into this problem just recently, and it took me longer to figure out than I'd care to admit :)
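Something along these lines should work for the 16-bit case. This is an untested sketch; `tex`, `width`, `height` and `frame` are placeholders:

```c
void upload_frame(GLuint tex, GLsizei width, GLsizei height,
                  const GLushort *frame)
{
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Rows are tightly packed 16-bit pixels: no padding at row ends. */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

    /* The incoming shorts are byte-swapped relative to the host
     * (big-endian camera data on little-endian x86). */
    glPixelStorei(GL_UNPACK_SWAP_BYTES, GL_TRUE);

    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_LUMINANCE, GL_UNSIGNED_SHORT, frame);

    /* Restore the defaults so other texture uploads aren't affected. */
    glPixelStorei(GL_UNPACK_SWAP_BYTES, GL_FALSE);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
}
```

Note that the unpack state belongs to the GL context, not to the texture, so it applies to every subsequent upload until you change it back.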

luke
  • If it makes you feel any better, the first time I encountered this problem I wrote a short GLSL program to fix it. I felt quite foolish when I eventually spotted GL_UNPACK_ALIGNMENT. – Tommy Mar 02 '11 at 20:58
  • @Tommy It's also #8 on opengl.org's list of common pitfalls. http://www.opengl.org/resources/features/KilgardTechniques/oglpitfall/ Silly Us. – luke Mar 02 '11 at 21:01
  • 1
    Excellent - `glPixelStorei(GL_UNPACK_SWAP_BYTES,1);` it is. That's the great thing about openGL, somebody has already solved it - you just need to know where. – Martin Beckett Mar 02 '11 at 21:06