Edit: The reason I am asking
I think I should clarify the platform I am using and why I am asking, to make the question easier to answer: I am on an x86_64 machine running Ubuntu, compiling with gcc, and I am working on an interpreter for a toy language. I think tagged pointers are a neat trick that could be used here. I know Apple uses them, so I just want to try them out.
I was reading about tagged pointers and was wondering: how can I know how many free bits there are in a pointer on a particular machine?
My current understanding is that on a 64-bit machine, the CPU always accesses memory at addresses that are multiples of 8 bytes, so the last 2 bits at the end of a pointer are always 0. Also, on an x86_64 machine, the first 14 bits are always 0, right? Since they are never used by the CPU. malloc will make sure that the pointers it returns are always aligned. But what about other memory locations, say variables on the stack?
How can I confirm this?
Somebody in the comments suggested that the 2 bits I mentioned above is not right, implying I am a bad programmer. I do not deny that I am not a very professional programmer, but let me explain a bit about why I said 2 instead of 3.
I wrote a very simple program like this:
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int a = 0;
    printf("%p\n", (void *)&a);   /* %p expects a void * */

    int *p = malloc(sizeof(int));
    printf("%p\n", (void *)p);

    free(p);
    return 0;
}
I compiled it with gcc and ran it for 10000 iterations on a 64-bit machine with Ubuntu. I found that &a always ends with its last 4 bits as 1100, and p always ends with 0000, so I wanted to be conservative about how many bits are actually left unused by the compiler. That is why I said 2 instead of 3.
Also, if you can help me explain what I have observed (&a ends with 1100, which has only 2 low bits set to 0), I would deeply appreciate it.