4

I am able to compile and run the following code with gcc 4.4.7.

File: m.c

#include <stdio.h>

int main()
{
    printf("%d\n", f(1, 2, 3));
}

File: f.c

int f(int a, int b)
{
    return a + b;
}

Output:

$ gcc m.c f.c && ./a.out
3

When the function f() is defined in the same file, the compiler reports an error as expected. My guess is that the compiler cannot detect erroneous usage of functions across compilation units. But shouldn't the linker be able to detect it? Does the standard specify the expected behavior?

Please note that this is different from declaring a function without any parameters, which works even inside a single file. (Why does gcc allow arguments to be passed to a function defined to be with no arguments?).

I am using gcc (GCC) 4.4.7 20120313 (Red Hat 4.4.7-11) and GNU ld version 2.20.51.0.2-5.42.el6 20100205.

Sourav
  • C isn't type checked or type safe like that :-( – Kerrek SB Dec 05 '14 at 20:20
  • For compatibility with C programs written before 1989. However, the code above invokes undefined behavior. "If the expression that denotes the called function has a type that does not include a prototype ... If the number of arguments does not equal the number of parameters, the behavior is undefined." – Raymond Chen Dec 05 '14 at 21:06
  • 1
    To avoid this problem, add a header `f.h` with a prototype `int f(int a, int b);`, and `#include` that header in `f.c` and in any source file that calls `f`. And invoke gcc with arguments that make it warn about calls to undeclared functions, either `-std=cXX` for code that doesn't need GNU extensions, or `-std=gnuXX` for code that does (where `XX` is `99` or `11`). – Keith Thompson Dec 05 '14 at 22:44
  • in C (and several other languages) the passed parameters are pushed onto the stack by the calling function, last parameter first, then the return address, then the called function's automatic variables. Therefore, from the view of the called function, there is no difference in what it sees on the stack. Pushing the last parameter first is what enables varargs functions, like printf, to work correctly. – user3629249 Dec 05 '14 at 23:32
  • the extra parameters are not caught at compile time because the code is missing the prototype for each function (prototypes should always be used, and certain gcc options will force their use). – user3629249 Dec 05 '14 at 23:34
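The fix suggested in the comments can be sketched as follows; everything is shown in one file for brevity, but in the real layout the prototype would live in a header `f.h` that both `m.c` and `f.c` `#include`:

```c
#include <stdio.h>

/* This prototype would normally live in f.h and be #included by
   both m.c and f.c. */
int f(int a, int b);

int f(int a, int b)
{
    return a + b;
}

/* With the prototype in scope, a three-argument call such as
   f(1, 2, 3) no longer compiles; gcc reports an error like
   "too many arguments to function 'f'". */
```

With the prototype visible, the mismatch is caught at compile time in every translation unit that calls `f`, regardless of the -std setting.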

2 Answers

5

gcc currently compiles with -std=gnu89 by default (not sure about 5.x, but it's true for the versions before). In C89 (GNU89 is almost a superset of C89), if a function is called without a visible declaration, it is implicitly assumed to be declared as

extern int f();

a function with external linkage, returning int, and accepting an unspecified (but fixed) number of default-promoted arguments.
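A minimal sketch of what that assumption means in practice (the function name `g` is made up for illustration): an empty parameter list in C89 is an old-style declaration that says nothing about the arguments, so calls to it are not checked.

```c
/* Old-style declaration: external linkage, returns int, takes an
   unspecified (but fixed) number of default-promoted arguments. */
int g();

/* A definition with int parameters is compatible with the declaration
   above, because int is unchanged by the default promotions. */
int g(int a, int b)
{
    return a + b;
}

/* Calls are checked only against `int g()`: the arguments are
   default-promoted but not counted, so g(1, 2, 3) would also
   compile here, invoking undefined behavior at run time. */
```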

This was considered a design mistake by many; it was marked obsolescent in C89 and finally removed in C99. Gcc warns about implicit function declarations with -Wimplicit-function-declaration (enabled by -Wall; in C99 and later modes it is on by default).

If you call a function with a wrong type or a wrong number of arguments, the behaviour is undefined; a diagnostic is required only if a prototype declaration is in scope at the call. The standard imposes no further requirements on the implementation, so the linker would even be allowed to fail (but usually doesn't).

Use header files with prototypes to get a warning.

Usually, a C compiler compiles each source file separately into an object file, in which a function is stored only as a symbol name, without any information about its parameter types, so the linker cannot check calls against definitions.

mafso
  • In addition, I suggest always compiling with the -Wall option; it will highlight this problem and also many other common mistakes. – roy Jul 16 '16 at 23:22
2

EDIT:

I think I misunderstood at first:

"When there is no function declaration gcc has no idea what to expect during the compile time phase, and as long as it finds the function during the linking phase everything will work. This 'error' has to be caught during the compile phase because it is not technically an error at all in the linking phase."

OLD ANSWER:

This is a feature, not a bug. When you call a function, the arguments get pushed onto the stack. If the callee doesn't use them, that's generally no big deal.

You can even plan on having an unknown number of arguments. Here is a simple example of a custom printf-style function for logging:

#include <stdarg.h>
#include <stdint.h>
#include <stdio.h>

void Debug_Message(uint32_t level, const char *format, ...)
{
    char buffer[256];

    //check level and do stuff

    va_list args;
    va_start(args, format);
    vsnprintf(buffer, sizeof(buffer), format, args);
    va_end(args);

    //buffer now contains data as if we did an sprintf to it
}

This would be called just like printf, could be:

Debug_Message(1, "%d%d%d", 1, 2, 3);

could be:

Debug_Message(1, "%d", 1);
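To complement the vsnprintf example above, here is a hedged sketch (the name `sum_ints` is invented for illustration) of how `va_arg` walks such an argument list one default-promoted value at a time:

```c
#include <stdarg.h>

/* Sums `count` int arguments, reading them one at a time with va_arg. */
int sum_ints(int count, ...)
{
    va_list args;
    int total = 0;
    int i;

    va_start(args, count);
    for (i = 0; i < count; i++)
        total += va_arg(args, int);   /* each argument is read as an int */
    va_end(args);

    return total;
}
```

As with printf, nothing stops a caller from passing a `count` that disagrees with the actual number of arguments; the callee simply trusts it.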
RobC
  • You're describing the (harmless) **run-time** effect. OP asks how come there is no **compile-time** error. – barak manos Dec 05 '14 at 21:06
  • 1
    Whoops, in this case I'd have to say it is because there is no function declaration. When there is no function declaration gcc has no idea what to expect during the compile time phase, and as long as it finds the function during the linking phase everything will work. This 'error' has to be caught during the compile phase because it is not technically an error at all in the linking phase. @barakmanos – RobC Dec 05 '14 at 21:23
    That was my thought as well, though as far as I know, when there is no declaration, the compiler implicitly assumes `int f()` or `int f(int)`, neither of which matches the call to `f(1, 2, 3)`. So I would still expect a compilation error (or warning). That being said, I'm not sure what the standard has to say about this. In any case, if you think that's the correct answer then maybe you should update yours accordingly. – barak manos Dec 05 '14 at 21:28
  • If the compiler emits type information into the object files, the linker may be able to detect that there is no matching definition of `f`. Do C object file formats have type information for function arguments? – Sourav Dec 05 '14 at 22:29
  • As far as the linker is concerned, I don't think definitions matter. The stack that is pushed when entering a function is just a chunk of bytes; the definition says which bytes belong to which variable. That is why there is the `...` argument notation, to tell the compiler we have no idea how many arguments there will be. I believe the strict definition is merely a guide to help prevent errors and to assign variables their stack slots automatically. @Sourav – RobC Dec 05 '14 at 22:45