I have read this post, and its answers describe the same behavior I outline below. I am not trying to make the code work on my machine or find a workaround; my question is whether this is defined behavior according to the standard.
Consider the following code, which creates an int variable and an int reference, then prints the result of applying the address-of operator to the reference:
#include <iostream>

int main() {
    int a = 70;
    int& b = a;
    std::cout << &b << std::endl;
    return 0;
}
It prints what I would expect: an address in memory, i.e., the address of the int variable a.

But when I change int to char (or unsigned char, or signed char), I get unexpected behavior in both Xcode (Version 6.4) and Visual Studio (VS 2013 Ultimate).
#include <iostream>

int main() {
    // or unsigned char or signed char, same weird behavior
    char a = 70;
    char& b = a;
    std::cout << &b << std::endl;
    return 0;
}
In Xcode, the console prints something like F\330\367\277_\377. I understand that F is the ASCII character with code 70, but I do not understand the rest of it. I assume it is also a sequence of characters, since Visual Studio likewise prints the F followed by some garbage characters.
I tried other integer types and they worked fine. I also know that char has the same representation as either signed char or unsigned char, depending on the implementation. The only explanation I can think of is that &b yields a char*, and std::cout takes a char* to mean a null-terminated string, so it prints the characters it finds in memory starting at &b until it happens to hit a zero byte.
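If that speculation is right, then forcing the pointer to void* should select the generic pointer overload of operator<< and print an address instead, just as in the int case. A minimal sketch of that check, assuming the char* interpretation above:

#include <iostream>

int main() {
    char a = 70;
    char& b = a;

    // If &b is otherwise treated as a pointer to a C-string, casting it
    // to void* should instead select the overload that prints the
    // pointer value itself, i.e., the address of a.
    std::cout << static_cast<void*>(&b) << std::endl;

    return 0;
}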
Is this defined behavior?
To reiterate, my question is more specific: is this behavior defined as part of the standard, is it left undefined by the standard, or is it a non-standard behavior of these particular compilers? Something else?