-4

According to the answer from my faculty, malloc dynamically allocates memory. Then why does the output show the same size for both the normal variable and the one assigned from malloc()? I am a newbie to programming, so I hope you can answer in a way a newbie can understand.

#include<stdio.h>

int main()
{
    int a,b;
    a = (int *) malloc(sizeof(int)*2);
    printf("The size of a is:%d \n",sizeof(a));
    printf("The size of b is:%d \n",sizeof(b));
    return 0;
}

Output:

The size of a is:4
The size of b is:4
Ravikiran

2 Answers

7
  1. malloc returns a pointer, so its result must be assigned to a pointer. You are declaring a plain integer with int a. This needs to be changed to int *a.

  2. The sizeof() operator will not give the number of bytes allocated by malloc. That count has to be maintained by the programmer and typically cannot be determined from the pointer alone (see the sketch at the end of this answer).

  3. For int *a, sizeof(a) will always return the size of the pointer itself:

    int *a;
    printf("%zu\n",sizeof(a));   // gives the size of the pointer e.g. 4
    a = malloc(100 * sizeof(int));
    printf("%zu\n",sizeof(a));    // also gives the size of the pointer e.g. 4
    
  4. You should always remember to free the memory you have allocated with malloc:

    free(a);
    

Edit: The printf format specifier should be %zu for a sizeof() result. See the comments below.
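
To tie points 2 to 4 together, here is a minimal, self-contained sketch (my own example with assumed variable names, not code from the question): sizeof applied to the pointer never changes, so the allocated element count has to be carried separately by the program:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t count = 100;                     /* number of ints we want */
        int *a = malloc(count * sizeof *a);     /* request space for them */
        if (a == NULL)
            return 1;                           /* allocation failed */

        printf("sizeof(a)       = %zu\n", sizeof a);           /* size of the pointer, e.g. 4 or 8 */
        printf("bytes requested = %zu\n", count * sizeof *a);   /* tracked by the programmer */

        free(a);                                /* release the allocation */
        return 0;
    }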

Rishikesh Raje
    Note: You should be using `%zu` to print `size_t`, the type `sizeof` returns – Spikatrix Jul 23 '19 at 07:18
  • In addition to @Spikatrix's comment: using a bad type format specifier invokes undefined behaviour. First: `size_t` is an *unsigned* type, so at the very least `%u` would be needed. Where `size_t` is defined as unsigned int we'd be fine, but on most modern systems (64-bit architecture!) size_t is defined as `unsigned long` (64-bit Linux) or `unsigned long long` (64-bit Windows), so you'd need different format specifiers on each type of system. That's why the `z` length modifier was introduced; it resolves to the correct length on any machine... – Aconcagua Jul 23 '19 at 07:54
  • Thanks @Aconcagua for the detailed inputs. I do understand that ideally the `%zu` should be used. But if not used, the `unsigned long` would be converted to a `signed int`. Not sure how this is undefined behaviour for a value of 4. – Rishikesh Raje Jul 23 '19 at 09:07
  • @RishikeshRaje It is undefined behaviour according to the standard, independent of sizes actually (by accident) matching or not. And if you consider 64-bit linux, unsigned long has a size of **8** bytes, so at least on this system, what you tell in the format (use 4 bytes) is a lie... – Aconcagua Jul 23 '19 at 11:37
  • C11, 7.21.6.9: *'If a conversion specification is invalid, the behavior is undefined. If any argument is not the correct type for the corresponding conversion specification, the behavior is undefined.'* Sure, C11 was superseded by C18 (I don't have a version available), but that passage shouldn't have changed, at worst moved. – Aconcagua Jul 23 '19 at 11:49
  • @Aconcagua - The value of `sizeof (a)` is 4. For `printf` and `fprintf` the following applies, C11, 7.21.6.1.2: *The fprintf function writes output to the stream pointed to by stream, under control of the string pointed to by format that specifies how subsequent arguments are converted for output*, i.e. on this machine the statement is equivalent to `printf("%d\n",(int32_t)sizeof(a));`. Since `sizeof(a)` has a value of less than `2^31`, this operation is not undefined. – Rishikesh Raje Jul 23 '19 at 13:28
  • @RishikeshRaje Pretty wrong. The format specifier refers *only* to the *type* of the variable being used, not the value held, which is entirely irrelevant. Assume a *big-endian* machine, with 64-bit long value: `long n = 7; printf("%d", n);` – not considering the UB invoked here, then most likely `printf` will only consider the next four bytes of the arguments, which, as big-endian, are the most significant ones of the argument, and output would be 0 (whereas it would have been 7 on a little endian machine). If the *value* was relevant, result would have had to be 7 on both machines... – Aconcagua Jul 23 '19 at 15:45
  • @Aconcagua - On further reading, I see your point. This can lead to undefined behaviour especially for 64 bit systems. – Rishikesh Raje Jul 24 '19 at 05:19
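
As a small illustration of the point made in this comment thread (my own sketch, not code from any of the commenters), the following program prints the type widths that make the `z` length modifier necessary; on a typical 64-bit Linux system, size_t is wider than int:

    #include <stdio.h>
    #include <stddef.h>

    int main(void)
    {
        /* On a typical 64-bit Linux system this prints 4, 8, 8:
           size_t is wider than int, so %d cannot describe it portably. */
        printf("sizeof(int)    = %zu\n", sizeof(int));
        printf("sizeof(long)   = %zu\n", sizeof(long));
        printf("sizeof(size_t) = %zu\n", sizeof(size_t));
        return 0;
    }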
2

You declare and define both variables as int. Nothing else has an influence on the value of sizeof().

int a,b;

This assigns a value to one of those ints which is very special, but it does not change anything about the fact that a remains an int (and your cast is misleading and does not do anything at all, let alone change anything about a).

a = (int *) malloc(sizeof(int)*2);

In order to change the above line into something sensible (i.e. a meaningful use of malloc), it should look like this:

int* a;
a = malloc(sizeof(int)*2);

I.e. a is now a pointer to int and gets the address of an area which can store two ints. No cast needed.
That way, sizeof(a) (on many machines) will still be 4, which is often the size of a pointer. The size of what it is pointing to is irrelevant.

The actual reason for using malloc() is determined by the goal of the larger scope of the program it is used for. That is not visible in this artificially short example. Work through some pointer-related tutorials. Looking for "linked list" or "binary tree" will get you on the right track.
What programs which meaningfully use malloc have in common is that they are dealing with data structures which are not known at compile time and can change during runtime. The unknown attributes could simply be the total size, but especially in the case of trees, the larger structure is usually unknown, too.
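
Here is a minimal sketch (my own illustrative example, not part of the original answer) of that typical situation: the element count only becomes known at runtime, so the storage has to come from malloc rather than a fixed-size array:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t n;
        printf("How many ints? ");
        if (scanf("%zu", &n) != 1 || n == 0)
            return 1;                          /* bad input */

        /* The size is only known at runtime, so a fixed array won't do. */
        int *values = malloc(n * sizeof *values);
        if (values == NULL)
            return 1;                          /* allocation failed */

        for (size_t i = 0; i < n; i++)
            values[i] = (int)i;                /* fill with something */

        printf("allocated %zu ints (%zu bytes)\n", n, n * sizeof *values);

        free(values);
        return 0;
    }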

There is an interesting aspect to note when using malloc():
Do I cast the result of malloc?

Yunnosch
  • Oh, then why do we use malloc()? Why not something like char a[50];? I know this is a basic question, but I am a newbie, so I hope you understand. – Ravikiran Jul 23 '19 at 07:06
  • You use `malloc` when you want to create an array during the execution of the code (for example when you want to create an array with a size taken from an input). Your array `char a[50]` has a fixed size, because its size is fixed at compile time. You can't do something like `char a[SizeFromInput]`; you need to malloc this memory: `char* a = malloc(sizeof(char) * SizeFromInput)`. – Kampi Jul 23 '19 at 07:09
  • Side note: Opinions on casting the result of malloc [differ](https://stackoverflow.com/a/14879184/1312382); the party in favour is a minority, but a notable one. I personally recommend reading the arguments of both sides and making up your own mind on the issue. If you ever write a function in a header (not that it would be a good idea), *do* cast, otherwise the header wouldn't be usable from C++. In any case: whatever your own mind is/will be, stay consistent with the style of the existing code you work on. – Aconcagua Jul 23 '19 at 08:04
  • @Kampi thank you so much, now it's crystal clear for me. Now I am capable of teaching the actual concept of malloc() to my faculty. – Ravikiran Jul 23 '19 at 14:52