In standard C, two integer types may have identical size, alignment, and representation, yet still be different types with different conversion ranks. For example, this applies to unsigned long and unsigned long long on x86_64-linux. It’s possible to distinguish such types using the _Generic feature of C11, even through a typedef:
#if 0 /* valid shell script */
${CC-cc} -std=c11 -Wall -o "${0%.c}" "$0" && exec "${0%.c}"
#endif
#include <assert.h>
#include <limits.h>
#include <stddef.h>
#if !defined __linux__ || !defined __LP64__
#error wrong platform
#endif
int main(void) {
    size_t n; /* only inspected by _Generic below, never evaluated */
    assert((size_t)-1 == ULONG_MAX && (size_t)-1 == ULLONG_MAX);
    assert(sizeof(size_t) == sizeof(unsigned long) &&
           sizeof(size_t) == sizeof(unsigned long long));
    assert(_Alignof(size_t) == _Alignof(unsigned long) &&
           _Alignof(size_t) == _Alignof(unsigned long long));
    assert(_Generic(n,
                    unsigned long long: !"should not be chosen",
                    unsigned long: "should be chosen"));
    return 0;
}
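The #if 0 /* valid shell script */ header makes the listing a polyglot: a shell treats the preprocessor lines as comments and executes the ${CC-cc} line, which compiles the file and execs the resulting binary, while the C compiler skips everything between #if 0 and #endif. A minimal sketch of the same trick with a trivial C body (the filename generic.c and the message are placeholders, not part of the original):

```shell
# Write a self-compiling polyglot to generic.c, then feed it to sh.
# The 'EOF' quoting keeps ${CC-cc} from being expanded at write time.
cat > generic.c <<'EOF'
#if 0 /* valid shell script */
${CC-cc} -std=c11 -Wall -o "${0%.c}" "$0" && exec "${0%.c}"
#endif
#include <stdio.h>
int main(void) { puts("compiled and ran"); return 0; }
EOF
# Note the leading ./ : ${0%.c} must name a path, or exec would
# search PATH for the freshly built binary instead of running it.
sh ./generic.c
```

If a C compiler is reachable as cc (or via CC), this prints compiled and ran; if the compile fails, the && short-circuits and the shell stumbles into the C source, which is why the trick is only a convenience for valid programs.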
It’s also possible to make a program fail to compile if the types are different, as you’d want in a configure script:
#include <stddef.h>
int main(void) { (void)sizeof((size_t *)0 - (unsigned long long *)0); return 0; }
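In a configure script, that one-liner would be compiled (not run) and the compiler’s exit status inspected: pointer subtraction is only defined between compatible pointer types, so the program compiles exactly when size_t and unsigned long long are compatible. A minimal sketch of such a probe, assuming only a POSIX shell and a compiler reachable as cc (override with CC); the conftest.c name follows autoconf convention but is otherwise arbitrary:

```shell
# Probe whether size_t and unsigned long long are compatible types
# by trying to compile the pointer-subtraction test program.
cat > conftest.c <<'EOF'
#include <stddef.h>
int main(void) { (void)sizeof((size_t *)0 - (unsigned long long *)0); return 0; }
EOF
if ${CC-cc} -c conftest.c -o conftest.o 2>/dev/null; then
  result="size_t is unsigned long long"
else
  result="size_t is not unsigned long long"
fi
rm -f conftest.c conftest.o
echo "$result"
```

On x86_64 glibc, where size_t is a typedef for unsigned long, the compile fails and the second message is printed.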
Can you detect the difference before C11, without having the compilation fail and without resorting to compiler-dependent features such as GNU cpp’s __SIZE_TYPE__?