
Consider the following example code:

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int *a = malloc(sizeof *a);
    *a = 5;
    free(a);
    return 0;
}

In this example, I allocate the integer a on the heap and initialize it to 5. This line specifically

int *a = malloc(sizeof *a);

is what confuses me (the sizeof *a part). To me, this looks like I am trying to get the size of the variable before it even exists, but I see that this style of initializing pointers is extremely common. When I compile this code with clang, I get no errors or warnings. Why does the compiler allow this? As far as I can tell, this is akin to doing something like

int a = a + 1;

without any previous declaration of a. This produces a warning with clang -Wall main.c:

main.c:17:13: warning: variable 'a' is uninitialized when used
      within its own initialization [-Wuninitialized]
    int a = a + 1;

What makes this line different from the pointer declaration with sizeof?

Arnav Borborah

2 Answers


The operand of the sizeof operator is not evaluated unless it has variable length array type. It is only examined to determine its type.

This behavior is documented in section 6.5.3.4p2 of the C standard:

The sizeof operator yields the size (in bytes) of its operand, which may be an expression or the parenthesized name of a type. The size is determined from the type of the operand. The result is an integer. If the type of the operand is a variable length array type, the operand is evaluated; otherwise, the operand is not evaluated and the result is an integer constant.

In this case, the compiler knows that *a has type int, so *a is not evaluated and sizeof *a is the same as sizeof(int).
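
A quick way to see this in action (a minimal sketch of my own; the variable names and output formatting are just for illustration):

#include <stdio.h>

int main(void) {
    int *p;     /* deliberately left uninitialized */
    int x = 0;

    /* *p is never dereferenced: sizeof only inspects the type,
       so this is well-defined and the same as sizeof(int). */
    printf("sizeof *p  = %zu\n", sizeof *p);

    /* The side effect in x++ is not evaluated either, so x stays 0. */
    printf("sizeof x++ = %zu, x = %d\n", sizeof x++, x);

    return 0;
}

Since the value of p is never read, a typical compiler has nothing to flag as an uninitialized use.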

dbush

In most cases, sizeof is a compile-time operator: the compiler simply knows the size of the type of its operand.

Secondly, by the time malloc is called, the variable a has already been defined. A variable must be defined (and its storage allocated) before it can be initialized. Otherwise, where would the initialization value be written?


The problem with

int a = a + 1;

isn't that a doesn't exist, it's that the value of a is indeterminate when you use it in a + 1.

For some types, an indeterminate value could be a trap representation, and reading such a value leads to undefined behavior.
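
To make the distinction concrete, here is a minimal sketch of my own: the name a is in scope inside its own initializer, so anything that only needs its type is fine; only reading its value is the problem.

#include <stdio.h>

int main(void) {
    int a = sizeof a;     /* fine: sizeof only needs a's type, never its value */
    /* int b = b + 1; */  /* not fine: this would read b's indeterminate value */

    printf("%d\n", a);    /* prints sizeof(int), typically 4 */
    return 0;
}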


A small note about the sizeof operator: the only time it is not computed by the compiler at compile time is for variable-length arrays, whose size is only known at run time.
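
For completeness, a small sketch of that case (the array name and size are chosen arbitrarily): with a variable-length array, sizeof must be computed at run time.

#include <stdio.h>

int main(void) {
    int n = 4;
    int vla[n];     /* variable length array: its size depends on n */

    /* Because vla has VLA type, sizeof is computed at run time
       and evaluates to n * sizeof(int). */
    printf("%zu\n", sizeof vla);

    return 0;
}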

Some programmer dude