
A previous question asked how to cast between integer types and GLvoid* in C++11 (based on that question's tags), but here I'm interested in C++20.

Now that std::bit_cast is available as an option for type conversion, I'm wondering whether it is the "correct" way to pass integers to OpenGL functions that, for historical reasons, take a GLvoid* representing a byte offset (e.g. glDrawRangeElements), or whether the methods referenced in the previous question should be used instead.
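
For concreteness, I mean calls along these lines (the draw parameters are made up, and I'm assuming an OpenGL loader header such as <GL/glew.h> or <glad/gl.h> declares glDrawRangeElements):

    #include <bit>
    #include <cstdint>

    void draw_at(std::uintptr_t offset)  // offset: byte offset into the bound index buffer
    {
        // The last parameter is declared const GLvoid*, but it carries a byte
        // offset into the bound GL_ELEMENT_ARRAY_BUFFER, not a real address.
        glDrawRangeElements(GL_TRIANGLES, 0, 99, 36, GL_UNSIGNED_INT,
                            std::bit_cast<const GLvoid*>(offset));    // this?
        // ...or one of the pre-C++20 spellings, e.g.:
        // glDrawRangeElements(GL_TRIANGLES, 0, 99, 36, GL_UNSIGNED_INT,
        //                     reinterpret_cast<const GLvoid*>(offset));
    }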

Nicol Bolas

2 Answers


bit_cast is definitely the wrong thing here. void* is not required to be the same size as most integer types, and bit_cast requires that the source and destination types have exactly the same size. Therefore, on 32-bit systems, std::bit_cast<void*>(24ull) is a compile error, while on typical 64-bit systems, a plain std::bit_cast<void*>(24) is a compile error.

And not even uintptr_t is required to be the same size as a pointer; it is only required to have at least as many bits as a pointer. If it has more, then bit_cast will choke on it.
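
To see the size requirement concretely (a sketch; which of these lines compile depends entirely on the target's type sizes):

    #include <bit>
    #include <cstdint>

    int main()
    {
        // The standard only guarantees that uintptr_t is at least as wide as void*.
        static_assert(sizeof(std::uintptr_t) >= sizeof(void*));

        // Usually fine, because uintptr_t usually matches void* exactly:
        [[maybe_unused]] auto p = std::bit_cast<void*>(std::uintptr_t{24});

        // auto q = std::bit_cast<void*>(24);    // error on typical 64-bit targets:
        //                                       //   sizeof(int) != sizeof(void*)
        // auto r = std::bit_cast<void*>(24ull); // error on typical 32-bit targets:
        //                                       //   sizeof(unsigned long long) != sizeof(void*)
    }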

Just do a reinterpret_cast<void*>, if you don't feel like using a C-style cast.
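
Applied to the buffer-offset idiom in the question, a sketch is enough (buffer_offset here is just an illustrative helper, not any standard API):

    #include <cstdint>

    // The value is a byte offset into the bound buffer, not an address,
    // so converting its representation is all that's needed.
    inline const void* buffer_offset(std::uintptr_t bytes)
    {
        return reinterpret_cast<const void*>(bytes);
    }

    // usage, with placeholder draw parameters:
    // glDrawRangeElements(GL_TRIANGLES, 0, 99, 36, GL_UNSIGNED_INT,
    //                     buffer_offset(3 * sizeof(GLuint)));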

Nicol Bolas

No, although (with an appropriately sized integer) it will often work in practice, and none of this is strictly conforming anyway. This is a C API, so it means (void*)offset (in C). If your toolchain has the expected C compatibility, you can write that in C++, in which case it's equivalent to reinterpret_cast<void*>(offset), but not necessarily to bit_cast<void*>(offset) (which preserves the bit pattern even if reinterpret_cast doesn't, and additionally requires that sizeof offset == sizeof(void*)).
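
A sketch of how the three spellings compare (using std::uintptr_t for the offset, which is itself only usually pointer-sized):

    #include <bit>
    #include <cstdint>

    void spellings(std::uintptr_t offset)
    {
        [[maybe_unused]] void* a = (void*)offset;                   // the C spelling the API implies
        [[maybe_unused]] void* b = reinterpret_cast<void*>(offset); // the equivalent C++ spelling
        [[maybe_unused]] void* c = std::bit_cast<void*>(offset);    // keeps the bit pattern, and is
                                                                    // well-formed only where uintptr_t
                                                                    // and void* have the same size
    }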

Davis Herring