You can't predict the size of the output from the number of source files. A file could contain a dozen lines of simple code, or thousands of lines of spaghetti, or a vast data table. Object files (and static libraries, which are just collections of object files) contain metadata used for linking, which can be removed from the final executable. They may also contain a large amount of debugging information, if the compiler was configured to generate that. You're building with no optimisation, which will usually bloat the code quite a bit. Inline functions will appear in every object file that uses them; linking will resolve these duplicates, but simply lumping the objects together with ar won't.
I just did the same experiment: I got 3.5 MB rather than 10 MB to start with; adding debug information (-g) increased it to 8 MB; enabling optimisation (-O3) reduced it to 1.6 MB, about the same size as the version installed on my computer. The dynamic library installed on my computer (generated with a linker, not ar) is smaller still, about 666 KB.
But why are you worrying about the size of the library? Are you planning to distribute a product as a precompiled static library? On floppy disks? Are you trying to develop on an Amiga 500? If you're developing for a limited platform, then worry about the size and runtime footprint of the final executables.