
In C99, the compiler will issue a warning if a function is called before its declaration. For example, this will cause a warning:

int sum(const int k) {
  return accsum(k, 0);   /* warning here: accsum has not been declared yet */
}

int accsum(const int k, const int acc) {
  if (k == 0) {
    return acc;
  } else {
    return accsum(k-1, k + acc);
  }
}

int main() {
  int x = sum(3);
  return 0;
}

The answer I've seen after Googling is that the declaration is needed so that the compiler can check the parameter types and catch errors. But why can't the compiler just go and find the definition of accsum when it encounters the call to accsum inside sum?
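For reference, a minimal sketch of the usual fix (illustrative, not part of the original post): a forward declaration of accsum placed above sum, so the compiler already knows accsum's parameter and return types when it reaches the call.

#include <stdio.h>

/* Forward declaration (prototype): gives the compiler the return type
   and parameter types of accsum before the first call to it. */
int accsum(int k, int acc);

int sum(const int k) {
  return accsum(k, 0);      /* no warning: accsum is already declared */
}

int accsum(const int k, const int acc) {
  if (k == 0) {
    return acc;
  } else {
    return accsum(k - 1, k + acc);
  }
}

int main(void) {
  printf("%d\n", sum(3));   /* prints 6 */
  return 0;
}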

Tan Wang
  • because the spec says that we need to – Ryan Mar 01 '15 at 01:13
  • Because otherwise the compiler will not know how to generate the code that calls the function. – Iharob Al Asimi Mar 01 '15 at 01:15
  • I'm sure I've seen this question before somewhere, but the SO search isn't helping me find the duplicate. – Jeffrey Bosboom Mar 01 '15 at 01:17
  • @iharob can you elaborate? How does the compiler generate the code that calls the function? – Tan Wang Mar 01 '15 at 01:17
  • I think you'll find this [answer](http://stackoverflow.com/questions/4757565/c-forward-declaration) regarding forward declarations illuminating, @JeffreyBosboom, but I wouldn't say it was a duplicate. – ljetibo Mar 01 '15 at 01:17
  • In addition to my [comment on a currently incorrect answer below](http://stackoverflow.com/questions/28788968/why-do-we-need-to-declare-functions-before-using-them-in-c#comment-45855145), the `pow` function is already compiled and is basically 1s and 0s by the time you get to the linker step, so type info is unavailable. Even before that, the compiler needs to know what to do with undeclared functions; it can't just look up the types in the library in which it's declared because the linking phase that lets the compiler know which libraries to actually search hasn't even been started yet. –  Mar 01 '15 at 03:45
  • It is like the "Miranda" rule - if you do not provide one, one will be provided for you. Very often, the assumptions on the one provided are incorrect. For speed, many of the older C compilers were one pass compilers so if there wasn't a declaration, they had to make assumptions. Basically, if you want it done correctly, do it yourself i.e. give it a forward declaration. – cup Mar 01 '15 at 09:47

2 Answers


Actually, in C89 and earlier it is not required that a function be declared before use. If it encounters an attempt to call an undeclared function, the compiler will assume the function takes an unspecified argument list and returns int.

The reason modern compilers give warnings on an attempt to call a function before seeing a declaration is that a declaration allows the compiler to check if arguments are of the expected type. Given a declaration, the compiler can issue warnings or errors if an argument does not match the specification in the function declaration. This catches a significant percentage of errors by the programmer.
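As a small illustration of that point (a sketch of my own, not from the original answer; scale is a made-up function): with a prototype in scope the compiler can convert and check the arguments at each call site.

#include <stdio.h>

/* Prototype: the compiler now knows scale takes two doubles and
   returns a double, so it can check and convert arguments at each call. */
double scale(double x, double factor);

int main(void) {
  /* The int arguments are converted to double because the prototype is
     visible. Passing something wildly wrong, e.g. scale("oops", 2),
     would be diagnosed at compile time for the same reason. */
  printf("%f\n", scale(3, 2));   /* prints 6.000000 */
  return 0;
}

double scale(double x, double factor) {
  return x * factor;
}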

As to why the compiler doesn't look ahead for a definition, there are a few contributing factors. First, there is the separate compilation model of C: there is no guarantee that a definition of the function exists within the current compilation unit (i.e. source file) at all. Second, requiring declarations lets the compiler work in a single pass, which improves compilation speed. Third, it encourages the programmer to actually declare functions (e.g. in header files), which enables reuse under that separate compilation model. Fourth, it increases the chances of a compiler being able to run on machines with limited memory.
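A sketch of the separate-compilation point (the file names sum.h, sum.c and main.c are illustrative, not from the original answer): the compiler translating main.c never sees sum.c, so the prototype in the header is all it has to check the call against.

/* sum.h -- the shared declaration that other translation units include */
#ifndef SUM_H
#define SUM_H
int sum(int k);
#endif

/* sum.c -- the definition, compiled on its own: cc -c sum.c */
#include "sum.h"

int sum(int k) {
  int acc = 0;
  while (k > 0) {
    acc += k--;
  }
  return acc;
}

/* main.c -- compiled separately (cc -c main.c) and then linked with
   sum.o; only the prototype from sum.h is visible here. */
#include <stdio.h>
#include "sum.h"

int main(void) {
  printf("%d\n", sum(3));   /* prints 6 */
  return 0;
}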

Rob
  • Basically you have two points here (which you duplicate, possibly to try and add more weight to them): efficiency and unprototyped functions. I find both irrelevant. Sure, if you want to compile GnuChess on your WiFi wristwatch, single-pass compilation will be a life saver :). As for "encouraging" people to declare functions, this is yet another circa 1970 leftover. The safe thing to do would be to handle undeclared functions as errors, but I suppose this was left out for upward compatibility reasons. – kuroi neko Mar 01 '15 at 11:02
  • I didn't suggest the points I listed were unrelated to each other, any more than I suggested I gave a complete answer. Function declarations (in the sense of being able to specify types of arguments in the declarations) are not a 1970 leftover: they were introduced in the 1989 ANSI (later 1990 ISO) standard. In any event, the design philosophy of C traded off safety to achieve various forms of efficiency. You, today, may deem such things irrelevant, but that doesn't change the reasons that C does things the way it does. – Rob Mar 01 '15 at 12:35
  • Of course prototyping comes from ANSI C. Unprototyped functions are the 1970 leftover. As for efficiency, I really can't see how keeping K&R prototypes is "efficient", except for allowing vintage code to compile without a cleanup. And yet again, compiler speed nowadays is a moot point. – kuroi neko Mar 01 '15 at 13:19
  • Actually, the first paragraph is **wrong**. The C programming language, i.e. ISO/IEC 9899:2011 (and the long-obsolete 9899:1999 mentioned in the question for that matter), does *not* allow implicit function declaration, end of discussion. – Antti Haapala -- Слава Україні Apr 08 '18 at 12:51

This is just a leftover from circa 1970 C.

Forcing the programmer to declare functions before using them allows the compiler to work in one pass (i.e. read the code only once).

40 years ago it was important to limit compilation time. Today it's just an annoyance.

Edit:

Of course you must declare external function prototypes, or else the compiler has no way to know which parameters and return value to expect. This has nothing to do with the order of declaration.

The safe and convenient thing to do would be to consider undeclared external functions as errors, and dispense the programmer from forward-declaring prototypes within the same compilation unit.

As for allowing calls to unprototyped functions, that is yet another circa 1970 leftover. C99 no longer allows implicit declarations, but you can still use K&R style declarations (a declaration with empty parentheses that specifies nothing about the parameters).

I suppose this dangerous possibility was preserved for upward compatibility reasons.
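A short sketch of that difference (my own illustration; average and mean are made-up names, and the behaviour described assumes C99/C11 semantics, where an empty parameter list means "unspecified parameters"):

#include <stdio.h>

/* Non-prototype ("K&R style") declaration: only the return type is
   given, so the compiler cannot check the number or types of the
   arguments at a call site. */
double average();

/* Prototype: the parameter types are part of the declaration, so a
   mismatched call is diagnosed at compile time. */
double mean(double a, double b);

int main(void) {
  printf("%f\n", average(1.0, 3.0));  /* accepted without any checking */
  printf("%f\n", mean(1.0, 3.0));     /* checked against the prototype */
  return 0;
}

double average(double a, double b) { return (a + b) / 2.0; }
double mean(double a, double b)    { return (a + b) / 2.0; }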

kuroi neko
  • That is wrong, not an annoyance at all. It is not just time, it is also memory consumption. For a small standalone program this may be an annoyance, but for large projects, especially C++, this is very important to reduce time and memory consumption. Resources are very cheap now, but they are limited to what you currently have. – loshad vtapkah Mar 01 '15 at 01:25
  • @loshadvtapkah what memory consumption? The symbol table that you were going to have anyway? – harold Mar 01 '15 at 01:51
  • If you don't `#include <math.h>` and try to use the standard `pow` function without the `double pow (double, double);` forward declaration, consider what happens when you do `int n = 20; n += pow (1, 9);`. The compiler can't find any record of `pow` anywhere, so there's nothing left to do except either force the compilation to fail or ignore the lack of a declaration and assume `pow` returns an `int`, which isn't right because suddenly n = n + 1**9 -> 20 + 1065353216 -> n=1065353236. Personally I prefer failure for that reason. Anyway, it's not merely a leftover (see the sketch after these comments). –  Mar 01 '15 at 03:47
  • @harold The more information it needs to keep for the second (and probably not the last) pass, the more memory it needs. I haven't checked what is inside compilers today and I am not measuring memory consumption now; I just remember that more than 10 years ago I was surprised that compiling each C++ file from one open-source project used about ~200 MB of memory per process. I am sure this is because of every little "it's just an annoyance" and "resources are cheap", but in total it grows. Fast development is good, but people have stopped thinking; they assume the computer should. This is wrong. – loshad vtapkah Mar 01 '15 at 10:45
  • @loshadvtapkah it needs that table anyway; the only difference is that it has to be prepared in an extra pass at the beginning instead of being built up as the parsing goes along. All the memory it's using is probably the optimization passes anyway, not a simple symbol table. I agree with the sentiment though, programmers have gotten lazy and have stopped thinking about resource usage. – harold Mar 01 '15 at 10:49
  • For one thing the question is about C, not C++. Compared with the truckloads of CPU and memory needed to process templates, the single-pass compilation gain is peanuts. As for @ChronoKitsune's remark, it has nothing to do with forward declaration. Of course you must declare function prototypes. The question is: in which order? – kuroi neko Mar 01 '15 at 10:50
  • I remember somewhere in the committee discussions about type-punning through unions, that it should be allowed only if the union is visible (that is, a file-scope union declared _after_ the function doing the reinterpretation is not enough), with the argument that this would otherwise cause problems with one-pass compilation. I doubt there are any one-pass compilers doing type-based aliasing analysis for optimizations, but I take this as evidence that the committee still tries to keep C one-pass compilable. – mafso Mar 01 '15 at 10:54
  • well since the question is tagged C and C99, I answered for both. I'll edit my answer to make it clearer. – kuroi neko Mar 01 '15 at 11:17
  • @loshadvtapkah: This answer is right. Having to declare functions in header files is an annoyance. Modern programming languages often use mechanisms like import directives to avoid having to do that, and are also mostly faster to compile than C++, especially when using templates. – Étienne Mar 15 '15 at 00:03
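To make the pow comment above concrete, a minimal sketch (my own illustration): with the prototype from <math.h> in scope, the double result of pow is converted correctly, whereas relying on an implicit int declaration, as pre-C99 compilers allowed, is undefined behaviour and produces the kind of garbage values described in that comment.

#include <math.h>   /* declares double pow(double, double); link with -lm on some systems */
#include <stdio.h>

int main(void) {
  int n = 20;
  /* pow(1, 9) returns the double 1.0; because <math.h> declares pow,
     the compiler knows to convert that double to int here. */
  n += (int)pow(1, 9);
  printf("%d\n", n);   /* prints 21 */
  return 0;
}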