Why are compile-time known format-strings not optimized?
Some potential optimizations are easy for compiler developers to support and have a large impact on the quality/performance of the code the compiler generates; others are extremely difficult to support and have only a small impact. Obviously compiler developers are going to spend most of their time on the "easier with higher impact" optimizations, and some of the "harder with less impact" optimizations are going to be postponed (and possibly never implemented).
Something like optimizing `printf("%d\n", 0);` into a `puts()` (or better, an `fputs()`) call looks like it'd be relatively easy to implement, but it would have a very small performance impact, partly because it'd be rare for that pattern to occur in source code anyway.
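For a concrete picture of what that rewrite means, here is a minimal sketch in C; the three statements below produce the same output, and the second and third show what an optimizer could plausibly substitute for the first (this is an illustration, not a claim about what any particular compiler emits):

```c
#include <stdio.h>

int main(void) {
    printf("%d\n", 0);     /* what the source code says                   */
    fputs("0\n", stdout);  /* what an optimizer could emit instead        */
    puts("0");             /* also equivalent; puts() appends the newline */
    return 0;
}
```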
The real problem is that it's a "slippery slope".
If compiler developers do the work needed for the compiler to optimize `printf("%d\n", 0);`, then what about optimizing `printf("%d\n", x);` too? If you optimize those, then why not also optimize `printf("%04d\n", 0);` and `printf("%04d\n", x);`, and floating point, and more complex format strings? Surely `printf("Hello %04d\n Foo is %0.6f!\n", x, y);` could be broken down into a series of smaller calls (`fputs()`, an integer-to-string conversion, ..), as sketched below?
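As a rough sketch of what such a breakdown would involve, the hypothetical helper below reproduces the output of that `printf()` call using only smaller library calls (`fputs()` plus `snprintf()` for the numeric conversions); it is purely illustrative and not what any real compiler generates:

```c
#include <stdio.h>

/* Hypothetical decomposition of:
 *     printf("Hello %04d\n Foo is %0.6f!\n", x, y);
 * into a series of smaller operations. */
static void print_broken_down(int x, double y) {
    char buf[64];                            /* scratch buffer for conversions */

    fputs("Hello ", stdout);
    snprintf(buf, sizeof buf, "%04d", x);    /* int-to-string with zero padding */
    fputs(buf, stdout);
    fputs("\n Foo is ", stdout);
    snprintf(buf, sizeof buf, "%0.6f", y);   /* double-to-string with precision */
    fputs(buf, stdout);
    fputs("!\n", stdout);
}

int main(void) {
    print_broken_down(42, 3.14159);                        /* decomposed version */
    printf("Hello %04d\n Foo is %0.6f!\n", 42, 3.14159);   /* original version   */
    return 0;
}
```

Even this single format string needs half a dozen separate calls, and every additional conversion specifier, flag, or field width adds another case the compiler would have to recognize and handle.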
This "slippery slope" means that the complexity increases rapidly (as the compiler supports more permutations of format strings) while the performance impact barely improves at all.
If compiler developers are going to spend most of their time on the "easier with higher impact" optimizations, they're not going to be enthusiastic about clawing further up that particular slippery slope.