I am wondering whether any C++ compilers (optimizers) guarantee that code heavily "decorated" with meaningless constructs (e.g. if (true) or the do { } while (false) pattern) will compile, with all speed optimizations enabled, into exactly the same executable as it would if the source code didn't contain these "decorations".
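To illustrate the kind of "decoration" I mean, here is a trivial, made-up example (not my real code):

```cpp
int plain(int x) {
    return x * 2 + 1;
}

// Same logic, "decorated" with meaningless control flow.
// The question is whether any optimizer guarantees identical
// machine code for both functions.
int decorated(int x) {
    int result = 0;
    do {                      // do { } while (false): body runs exactly once
        if (true) {           // if (true): branch is always taken
            result = x * 2 + 1;
        }
    } while (false);
    return result;
}
```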
I hope my question is clear - if not, I will try to rephrase it - but here is why I am asking, to make it more concrete: I have a "legacy" piece of C++ code that uses macros quite heavily. Some of them expand to multiple statements, so it seemed reasonable to wrap them in do { } while (false); however, the macro bodies themselves contain statements like "break" or "continue", so I went even further, to "if (true) [MacroBody] else do { } while (false)" (sketched below). So far, so good - maybe not very clean, but everything should be fine. However, I noticed that my executable (MSVC, all optimizations for speed) grows by 2 kB after such "decorating".
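Here is a sketch of the wrapping I mean (the macro name and body are invented for illustration, not taken from the real code):

```cpp
#include <cstdio>

// Original style: the macro body is pasted in bare, and the continue
// inside it binds to whatever loop surrounds the call site.
#define SKIP_IF_NEGATIVE_OLD(x)        \
    if ((x) < 0) {                     \
        std::printf("skipping\n");     \
        continue;                      \
    }

// "Decorated" style: if (true) ... else do { } while (false).
// A plain do { } while (false) wrapper is not usable here, because it
// would capture the continue; this form leaves break/continue bound to
// the caller's loop, avoids the dangling-else problem, and still forces
// a trailing ';' at the call site.
#define SKIP_IF_NEGATIVE_NEW(x)        \
    if (true) {                        \
        if ((x) < 0) {                 \
            std::printf("skipping\n"); \
            continue;                  \
        }                              \
    } else                             \
        do { } while (false)

int main() {
    for (int i = -2; i < 3; ++i) {
        SKIP_IF_NEGATIVE_NEW(i);   // continue still refers to this for loop
        std::printf("%d\n", i);
    }
}
```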
(My feeling is that the compiler, after performing some initial optimizations, was satisfied enough with what it had done so far and stopped optimizing further. The bad part is that I was unable to produce a minimal reproduction - the difference in executable size shows up only in the real-life code. That is why I would like to ask whether any of you, with more background in compiler construction, know something about this. If my perception is right, it may currently work somewhat heuristically - the compiler reaching a conclusion like: "Looks like I've done quite a few optimizations already; that's probably enough.")