At some locations within a program, a variable may be in either of the following conditions:

1. Inputs that have been received have caused the variable to be written, and will also cause code to use its value in the future.
2. Inputs that have been received have caused the variable not to be written, and will also cause code to make no use of its value.
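As a concrete sketch of that pattern (the helpers get_mode, compute, and consume are hypothetical placeholders, not any real API), the variable is written on exactly those inputs that will later cause it to be read:

    extern int get_mode(void);   /* hypothetical: reads some input */
    extern int compute(void);    /* hypothetical: produces a value */
    extern void consume(int);    /* hypothetical: uses a value */

    void demo(void)
    {
        int value;               /* deliberately left uninitialized */
        int mode = get_mode();
        if (mode)
            value = compute();   /* written exactly on those inputs... */
        /* ... other work ... */
        if (mode)
            consume(value);      /* ...that will also cause it to be read */
    }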
Some compilers will squawk if they see that there exist inputs which would cause a variable not to get written and inputs which would cause the variable to be read. Even if the only inputs that would cause the variable to get read would also cause it to get written, the compilers may not realize that. While it would be possible to silence the compiler's complaints by unconditionally initializing the variable, machine code that initializes the variable with a value which can never actually get used would serve no functional purpose. A source-code construct that silences the compiler's complaints without making it generate such useless machine code may thus be preferable to one that forces the useless code to be generated. On some compilers, a self-initializing declaration may serve that purpose.
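A hedged illustration of that idiom, applied to the sketch above: gcc has historically treated a declaration that reads the variable in its own initializer as a deliberate request to leave it uninitialized, suppressing the diagnostic without emitting any store (its -Winit-self option exists to re-enable a warning for exactly this form, and formally the initializer reads an indeterminate value):

    extern int get_mode(void);   /* hypothetical helpers, repeated so */
    extern int compute(void);    /* the sketch stands alone           */
    extern void consume(int);

    void demo_quiet(void)
    {
        int value = value;       /* self-initializing declaration: on
                                    compilers that accept the idiom, this
                                    silences the "may be used uninitialized"
                                    squawk while generating no store */
        int mode = get_mode();
        if (mode)
            value = compute();
        if (mode)
            consume(value);
    }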
If a type has trap representations, there may be no practical alternative to writing an initialization that a compiler will likely turn into useless machine code, but if a type doesn't have trap representations there is no particular reason why a quality compiler should force programmers to ask for useless code they don't want and don't need. Rather than try to write different rules for platforms where various types do and don't have trap representations, however, the authors of the Standard defer to implementers' judgment as to what sorts of behavior would be sensible for a particular target platform and application field.
If on some platform a compiler couldn't allow programmers to omit initialization of variables which are copied but otherwise unused, except by generating likely-redundant initializations itself, it would make sense to require the programmer to include initializations in all situations that could actually occur. On a platform where copying a meaningless value would simply cause the destination to hold a meaningless value, however, it would generally make more sense to allow a programmer to omit initialization in cases where none of the copies would ever end up getting used for any real purpose. The authors of the Standard make no effort to specify what would make sense in any particular case, because they expect that people writing implementations should be capable of exercising good judgment.
The best approach for dealing with this situation would probably be to write a macro which accepts a variable and whose expansion will either initialize the variable to zero (on implementations whose authors judge that there is more value in allowing the compiler to jump the rails when an uninitialized variable is copied, even if none of the copies are really "used" for anything, than there would be in guaranteeing that the mere act of copying a variable will have no side effects) or else do nothing (on implementations which will stay on the rails even without the useless initialization).
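A minimal sketch of such a macro; CONFIG_NEEDS_DUMMY_INIT is a hypothetical knob a build system would define for targets whose implementations don't promise that copying an uninitialized object is benign:

    /* Expands to a real store on implementations that need it, and to
       nothing on implementations that stay on the rails without it. */
    #ifdef CONFIG_NEEDS_DUMMY_INIT
      #define UNINIT_OK(var) ((var) = 0)   /* dummy store: keeps things defined */
    #else
      #define UNINIT_OK(var) ((void)0)     /* no code generated at all */
    #endif

    extern int get_mode(void);   /* hypothetical helpers, as before */
    extern int compute(void);
    extern void consume(int);

    void demo_macro(void)
    {
        int value;
        UNINIT_OK(value);        /* a store, or nothing, per the target */
        int mode = get_mode();
        if (mode)
            value = compute();
        if (mode)
            consume(value);
    }

The one macro invocation then documents the intent at the point of declaration, while the cost of the dummy store is paid only on the targets that actually need it.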