
I am having issues reading function arguments with `va_arg`. When I run the following code with clang on my Mac M1, I get the expected output, but when I run the same code with gcc-11 on the same machine, I get garbage.

#include <stdio.h>
#include <stdarg.h>

#define ND_indices size_t

void llF(ND_indices rank, ...){
    /* Get all the args of type ND_indices*/
    ND_indices arg_val;
    va_list ap;
    va_start(ap, rank);
    for (ND_indices i=1; i<=rank; ++i){
        arg_val = va_arg(ap, ND_indices); 
        printf("%lu \n",arg_val);
    }
    va_end(ap);
}

int main()
{
    llF(10, 3, 2, 4, 2, 3, 4, 4, 6, 7, 6);
    //printf("Hello World");

    return 0;
}

output with clang (Apple clang version 13.1.6 (clang-1316.0.21.2.5)):

3
2
4
2
3
4
4
6
7
6

output with gcc-11 (installed from Homebrew) on the same Mac M1 (some garbage):

3
2
4
4294967298
4294967299
4294967300
4294967300
9331036219741569030
4294967303
4294967302

The code was compiled with `gcc-11 test.c -Wall`. I also noticed that the code runs fine with 3 arguments, even with gcc-11.

PS: I tried running on an x86 machine with clang, gcc-10, and icc, and also tried other online C compilers; I do not get any issues there. I am confused about what is going wrong.

Thanks for helping.

– Aditya Kurrodu

  • You are passing `int`s to the function and then you try to extract `size_t`s with `va_arg`. The types must be (mostly) compatible, otherwise undefined behavior. – user17732522 Jul 07 '22 at 16:35
  • Also, `%lu` should be `%zu` for `size_t`. – Ian Abbott Jul 07 '22 at 16:37
  • `#define ND_indices size_t` is a really bad practice. See https://stackoverflow.com/questions/36518029/proper-way-to-define-type-typedef-vs-define (and many others...) – Andrew Henle Jul 07 '22 at 16:52
  • I cannot reproduce it with gcc-10.2.1 on Debian x86-64. Run this in a debugger and see what is being passed on the stack. The arguments passed in, I assume, are `int`. Is `sizeof(int) == sizeof(size_t)` on the platform that fails? – Allan Wind Jul 08 '22 at 16:57
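
Following up on the comments above, here is a minimal sketch (my own illustration of what the comments suggest, not the original code) of how the call can be made well-defined: `ND_indices` becomes a typedef, every variadic argument is cast to `ND_indices` at the call site so that `va_arg(ap, ND_indices)` reads back the same type that was passed, and the values are printed with `%zu`.

#include <stdio.h>
#include <stdarg.h>
#include <stddef.h>

typedef size_t ND_indices;  /* typedef instead of #define, as suggested in the comments */

void llF(ND_indices rank, ...)
{
    /* Get all the args of type ND_indices */
    ND_indices arg_val;
    va_list ap;
    va_start(ap, rank);
    for (ND_indices i = 1; i <= rank; ++i) {
        /* Well-defined: the caller passed each argument as ND_indices (size_t) */
        arg_val = va_arg(ap, ND_indices);
        printf("%zu\n", arg_val);  /* %zu is the conversion specifier for size_t */
    }
    va_end(ap);
}

int main(void)
{
    /* Cast each variadic argument so an int is not passed where a size_t is read */
    llF(10, (ND_indices)3, (ND_indices)2, (ND_indices)4, (ND_indices)2,
        (ND_indices)3, (ND_indices)4, (ND_indices)4, (ND_indices)6,
        (ND_indices)7, (ND_indices)6);
    return 0;
}

The garbage values seen with gcc-11 on arm64 are at least consistent with the callee reading an 8-byte slot where the caller only wrote a 4-byte int (the low 32 bits of most of the garbage match the intended values), while on x86-64 the same mismatch often happens to look fine; either way it is undefined behavior, which the casts above avoid.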

0 Answers