We have been using a trick copied from the autotools to determine the size of off_t and whether we need to define _FILE_OFFSET_BITS=64. However, this trick seems to fail with recent gcc (>= 4.6). Here is the code:
#include <sys/types.h>
int main(int argc, char **argv)
{
    /* Cause a compile-time error if off_t is smaller than 64 bits */
#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62))
    int off_t_is_large[ (LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 1 : -1 ];
    return 0;
}
On my debian/amd64 and debian/ppc installations, here is what I get (instead of a compile-time error):
$ gcc-4.6 -O0 -m32 -o valid.o valid.c
valid.c: In function ‘main’:
valid.c:7:3: warning: left shift count >= width of type [enabled by default]
valid.c:7:3: warning: left shift count >= width of type [enabled by default]
valid.c:7:3: warning: left shift count >= width of type [enabled by default]
valid.c:7:3: warning: left shift count >= width of type [enabled by default]
So my question: can I simply replace this code with the following one?
#include <sys/types.h>
int main(int argc, char **argv)
{
    /* Cause a compile-time error if off_t is smaller than 64 bits */
    int off_t_is_large[ sizeof(off_t) >= 8 ? 1 : -1 ];
    return 0;
}
(Extra) question: is this a regression in gcc, or was the initial code just relying on a broken feature?