As I understand it, `GLuint` (a data type in OpenGL) is exactly the same as `unsigned int`. So when is it better to use it?


1 Answer

Always use `GLuint` when you are going to pass those variables to OpenGL calls. It's possible that these types won't be plain `unsigned int`s on every platform or in future versions, and if your code uses `unsigned int` instead, it may break mysteriously. This is true in general for any such library-specific data type.
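
For illustration, here is a minimal sketch of what that looks like in practice. It assumes an OpenGL context and a loader such as GLAD are already set up; the header name comes from that assumed setup:

```cpp
#include <glad/glad.h>  // assumed loader; any header declaring the GL types works

void createBuffer()
{
    // Portable: glGenBuffers is declared as taking a GLuint*,
    // so declare the handle with the library's own type.
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    // Risky alternative: declaring the handle as `unsigned int` compiles
    // today only because GLuint happens to be typedef'd to unsigned int
    // on this platform; it would break if the typedef ever differed.
}
```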

    More precisely, OpenGL has very specific requirements for all of its data types. `GLuint` _must_ always be a 32-bit unsigned integer on all implementations. C++ doesn't give that guarantee when you use `unsigned int`. – Andon M. Coleman Feb 02 '15 at 01:08
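
Building on that comment, one hedged way to make the size guarantee checkable is a compile-time assertion (`static_assert` is standard C++11; the GLAD header is again an assumption of the setup above):

```cpp
#include <glad/glad.h>  // assumed loader; any header declaring the GL types works
#include <climits>

// The OpenGL spec fixes GLuint at exactly 32 bits; the C++ standard only
// guarantees that unsigned int is at least 16 bits, so verify at compile time.
static_assert(sizeof(GLuint) * CHAR_BIT == 32, "GLuint is expected to be 32 bits");
```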