I was playing around with C on Mac OS X (version 10.6.8); this is a small program to test what happens when you declare an int but don't initialize it:
#include <stdio.h>

int main() {
    int a;                  /* deliberately left uninitialized */
    printf("\n%d\n\n", a);  /* reads whatever happens to be in that stack slot */
}
When I do gcc -std=c99 test.c; ./a.out, I get a predictable result: 32767 (SHRT_MAX, the maximum of a signed 16-bit short, not the max of an unsigned int as I first thought). However, something strange happens when I use gcc's -o option. When I do gcc -std=c99 test.c -o test.out; ./test.out, I get something else: 0.
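For comparison (my own sanity check, not part of the original test), a version where a is explicitly initialized prints the same value every single time, no matter how it is compiled or what the binary is named, since reading an initialized variable is well-defined. The value 42 is just my arbitrary choice:

#include <stdio.h>

int main() {
    int a = 42;             /* initialized, so the behavior is defined */
    printf("\n%d\n\n", a);  /* always prints 42 */
}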
I have figured out that it doesn't matter what I name the source file or how I compile it; only the name of the output binary matters. If the binary is not named a.out, I get 0. When it is a.out, I get 32767 or some other larger number.
Contrary to some of the thoughts below, the number is semi-random (mostly 32767, but sometimes a larger value) only when the compiled binary is a.out, which is really weird. It could just be my computer; or perhaps I have, luckily or unluckily, received the same results 30 times in a row.
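In case it helps anyone reproduce or poke at this, here is a slightly expanded sketch (the extra printing is my addition, not part of the original program). It prints the indeterminate value together with the address of a, so one could check whether the variable ends up at a different stack address in a.out versus test.out:

#include <stdio.h>

int main() {
    int a;  /* still uninitialized on purpose */
    /* Print the garbage value and where a lives on the stack;
       casting to void * is the portable way to print a pointer with %p. */
    printf("value   = %d\n", a);
    printf("address = %p\n", (void *)&a);
}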