On some old types of machine (those on which I learned to program in C), the value of the char * address for a memory location was not the same as the int * address for the same location, even assuming that the address was sufficiently well aligned. The machine in question was an ICL or Three Rivers machine called the Perq.
The Perq was a micro-coded machine: you could add to its instruction set if you were good enough. Its basic addressing unit was a 16-bit word. The fact that an address was a character pointer, and whether it was the even or odd byte that was being addressed, was encoded in the high-order (most significant) address bits, not the low-order (least significant) bits.

This was in the days when an increase from 1 MiB to 2 MiB of main memory gave a program about 5 times as much memory to run in, because the OS used about 3/4 MiB of it; handling gigabytes of memory was not even a consideration (heck, the disk drives were much smaller than 1 GiB, let alone the main memory). Consequently, 'wasted bits' in the addresses were not an issue either.
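Purely as an illustration (the flag positions below are invented, not the Perq's real address format), the scheme can be modelled as a word address with the character-pointer information carried in high-order bits, so that converting between an int *-style address and a char *-style address genuinely changes the bit pattern:

#include <inttypes.h>
#include <stdio.h>

/* Invented layout: the low bits hold the word number; the fact that an
   address is a character pointer, and which byte of the word it names,
   live in high-order bits rather than in the low-order bit. */
#define CHAR_PTR_FLAG 0x80000000u   /* address is a character pointer        */
#define ODD_BYTE_FLAG 0x40000000u   /* it refers to the odd byte of the word */

/* Build a char *-style address from a word address and a byte selector. */
static uint32_t byte_address(uint32_t word_addr, int odd_byte)
{
    return word_addr | CHAR_PTR_FLAG | (odd_byte ? ODD_BYTE_FLAG : 0u);
}

/* Recover the int *-style word address from a byte address. */
static uint32_t word_address(uint32_t byte_addr)
{
    return byte_addr & ~(CHAR_PTR_FLAG | ODD_BYTE_FLAG);
}

int main(void)
{
    uint32_t ip = 0x1234u;              /* an int * to some word          */
    uint32_t cp = byte_address(ip, 0);  /* a char * to the same location  */

    /* Same location, different bit patterns: a cast has real work to do. */
    printf("int  * pattern: 0x%08" PRIX32 "\n", word_address(cp));
    printf("char * pattern: 0x%08" PRIX32 "\n", cp);
    return 0;
}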
This was also in the days long before the first C standard. Memory allocation functions such as malloc() were declared as:
extern char *malloc(); /* No prototypes - no <stdlib.h> either */
And woe betide you if you forgot to declare the function before using it: the compiler would assume it returned an int, the returned value would be horribly mismanaged, and your program would crash. And, in those days, it was necessary to cast the result of malloc(). The compilers of the day didn't have options to help you spot the problem; the lint program, now relegated to the dustbin of history, was a crucial check to help ensure that your program was coded correctly.
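For flavour, here is a sketch of what allocating code of that era looked like (make_table is just an illustrative name, and a modern compiler will rightly grumble about most of it). Both the explicit declaration of malloc() and the cast of its char * result were necessary:

extern char *malloc();      /* no <stdlib.h>, so you declared it yourself */

/* Old-style (K&R) definition: allocate a table of n ints. */
int *make_table(n)
int n;
{
    int *table = (int *)malloc(n * sizeof(int));  /* the cast was mandatory */

    if (table == (int *)0)  /* 0 cast to the pointer type served as null */
        return (int *)0;
    return table;
}

In standard C, <stdlib.h> declares malloc() as returning void *, which converts to any object pointer type without a cast; ironically, under C89 rules, keeping the cast could hide exactly the missing-declaration bug described above.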
So, on some real (but now archaic) systems, a cast really did change the bit pattern of a pointer.
Some mainframe systems encoded type information in their pointers — ICL 2900 Series machines were an example of that. C was hard to port to such systems.