Yes, C++ compilers are allowed to generate efficient code, and an ideal C++ compiler would be capable of making the code as efficient as possible.
An ideal compiler would make use of every optimization technique you can think of. Unlike a real compiler, an ideal one is not subject to those pesky limitations of time and space (and human ingenuity), so it would implement even the most outlandish optimization ideas. Optimizations that are currently possible in another language (such as Lisp) are not outlandish and certainly fall within the capabilities of an ideal C++ compiler.
I would think that the above applies to all compiled languages, not just C++. However, the C++ standard does make this explicit with the as-if rule, which establishes that the standard mandates only the observable behavior; compilers are allowed to achieve this behavior however they see fit. In fact, as far as the standard is concerned, a compiler could generate a magic crystal ball and be compliant, as long as the crystal ball causes the correct observable behavior.
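To make the as-if rule concrete, here is a minimal sketch (the function name `sum_to` is my own, for illustration): the only observable behavior of this function is its return value, so a compiler that can prove the loop computes a closed-form result is free to discard the loop entirely.

```cpp
// Returns 1 + 2 + ... + n. Under the as-if rule, a compiler may
// replace the loop with the closed form n * (n + 1) / 2, since the
// return value is the only observable behavior of this function.
int sum_to(int n) {
    int total = 0;
    for (int i = 1; i <= n; ++i) {
        total += i;
    }
    return total;
}
```

This particular transformation is not even hypothetical: mainstream compilers such as GCC and Clang routinely reduce loops like this to straight-line arithmetic at ordinary optimization levels.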
Truly, the C++ standard does not prohibit speed.
OK, invoking magic might be too much hyperbole for some people's tastes. For an extreme example more grounded in reality, consider sorting. Suppose there is a function that sorts an array in such a way that there are no observable side effects; the only observable behavior from this function is that the array transitions from arbitrary order to sorted. This much should feel quite familiar to many readers.
If a C++ compiler is given this function, then the only mandate is that the generated machine code must sort the array with no side effects. Think about that. The machine code for that function must preserve the observable behavior; it does not have to conform precisely to the code that the programmer wrote. The compiler could, in theory, replace an implementation of bubble sort with one of heap sort. It has the same observable behavior.
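As a sketch of that sorting example (the function name and signature are mine, not anything mandated by the standard), consider a deliberately naive bubble sort. Its only observable behavior is that the array ends up sorted, so machine code implementing heap sort, or any other correct sort, would satisfy the as-if rule just as well.

```cpp
#include <cstddef>
#include <utility>

// A deliberately naive bubble sort with no observable side effects:
// no I/O, no volatile accesses, no calls to opaque functions. The
// as-if rule only requires that the generated code leave the array
// sorted, so the compiler could, in principle, emit any correct
// sorting algorithm here.
void bubble_sort(int* a, std::size_t n) {
    for (std::size_t i = 0; i + 1 < n; ++i) {
        for (std::size_t j = 0; j + 1 < n - i; ++j) {
            if (a[j] > a[j + 1]) {
                std::swap(a[j], a[j + 1]);
            }
        }
    }
}
```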
As far as I know, no one developing a compiler has considered an optimization at this level (nor should they, in my opinion). However, it is explicitly allowed by the C++ standard. This demonstrates how far the as-if rule can be pushed. Any valid optimization that can be imagined is allowed. An ideal compiler would implement every optimization that is possible (as opposed to realistic compilers, which at best can only strive to implement those optimizations that are reasonable). In particular, any optimization Lisp can do can also be done by an ideal C++ compiler.
For the follow-up question, yes, it would be bad practice, but for a reason other than the one you proposed.
A manual code optimization is very likely to fall under the banner of premature optimization, which Knuth famously called "the root of all evil". Writing premature optimizations is bad coding practice.
If your manual optimization is not premature, then either it is routine, or performance testing has established the need for it. That same testing could determine whether the manual optimization actually beats your compiler, rendering the follow-up question moot.
Furthermore, since no ideal compiler exists, it would be a bad idea to let real code be influenced by the capabilities of an ideal compiler.