I was running a simple program on my laptop (64-bit Ubuntu MATE):
#include <stdio.h>

int main()
{
    int i, j, l = 0, swap, n, k, a[100], b[100], count;
    printf("%d\n", count);  /* count is never initialized: reading it is undefined behavior */
}
As expected, this should print a garbage value (reading an uninitialized variable is undefined behavior), and it was doing exactly that. I ran it several times and got a different result almost every time, which I took to indicate that fresh memory was handed to the program on each run. The output was something like this:
32576
33186
0
29318
0
32111
0
However, notice that in some instances I got zero.
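To illustrate where such leftover values can come from, here is a minimal sketch (the function names are made up; reading an uninitialized variable is undefined behavior, so what it prints depends on compiler, optimization level, and platform). A value written onto the stack by one call can show up in the uninitialized local of a later call:

#include <stdio.h>

void dirty(void)
{
    volatile int x = 12345;  /* write a known value into this frame's stack slot */
    (void)x;
}

void peek(void)
{
    int y;              /* uninitialized: reading it is undefined behavior */
    printf("%d\n", y);  /* may print 12345 (leftover from dirty), 0, or anything else */
}

int main(void)
{
    dirty();
    peek();
    return 0;
}

Compiled with something like gcc -O0, this often prints 12345 because peek's frame reuses the stack memory dirty just wrote; with optimization enabled the compiler is free to do anything.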
Now this same code was executed on a Solaris server; I ran it from a thin-client machine connected to that server. This time the program printed only 0, no matter how many times I executed it. I logged into the server from a different account and still got the same result. I thought the Solaris machine had some sort of inherent garbage collector that zeroes memory, but that was not true: when I ran another program, I got garbage values again.
#include <stdio.h>

int main()
{
    int i;              /* uninitialized */
    printf("%d\n", i);  /* undefined behavior: may print anything */
}
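For what it's worth, one possible explanation for the consistent zero (an assumption on my part, not a verified answer): the kernel zero-fills pages before handing them to a process, so memory a process touches for the very first time reads as 0, while memory reused within the same process keeps its old contents. A minimal sketch of the zero-fill guarantee, assuming a POSIX system with anonymous mmap:

#include <stdio.h>
#include <sys/mman.h>

int main(void)
{
    size_t len = 4096;
    /* Ask the kernel for a brand-new anonymous page */
    int *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                  MAP_PRIVATE | MAP_ANON, -1, 0);
    if (p == MAP_FAILED) {
        perror("mmap");
        return 1;
    }
    printf("%d\n", p[0]);  /* always 0: fresh pages arrive zero-filled */
    munmap(p, len);
    return 0;
}

Whether an uninitialized stack variable happens to land on such a fresh page, or on stack memory already dirtied by earlier code (the C runtime, the shell, etc.), could plausibly differ between the two systems.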
My question is: is there some difference between the two platforms, or am I just a victim of randomness?