So, it's been just over eleven years since a Debian maintainer infamously caused OpenSSL's RNG seeds to become predictable by commenting out code that mixed uninitialized memory into the entropy pool.
The issue generated a lot of heated discussion online, most of it focused on criticizing the review process or attacking the developers involved.
However, I haven't been able to find any information on the actual thought process behind the code being there in the first place. A lot of users argue that "worst case, it can't hurt" - but that reasoning seems utterly counter-intuitive to me.
After all, reading from uninitialized memory invokes undefined behavior which, infamously, can cause nasal demons, run nethack, or format your hard drive. Introducing that sort of logic into any program - let alone a crypto library - seems to put you one aggressive compiler optimization away from utter disaster.
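To make the concern concrete, here's a minimal sketch of the pattern as I understand it (`pool` and `pool_mix` are hypothetical stand-ins, not OpenSSL's actual API):

```c
#include <stdio.h>

/* Hypothetical stand-ins for an entropy pool and its mixing function;
   this is NOT OpenSSL's actual code, just the shape of the pattern. */
static unsigned char pool;

static void pool_mix(const unsigned char *buf, size_t n)
{
    for (size_t i = 0; i < n; i++)
        pool ^= buf[i];   /* reads indeterminate values - the crux of the question */
}

int main(void)
{
    unsigned char buf[16];        /* deliberately never initialized */
    pool_mix(buf, sizeof buf);    /* the "worst case, it can't hurt" step */
    printf("pool = %u\n", pool);  /* arguably, the compiler is free to fold
                                     this to any value it likes */
    return 0;
}
```

As I read the standard, the marked read inside `pool_mix` is where the trouble would start - yet this pattern apparently sat in a widely deployed crypto library for years.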
Thus, my questions:
- Am I misunderstanding something? Is there a reason this actually doesn't invoke UB and is instead well-defined under the standard?
- If it does in fact invoke UB, why was this code originally included in OpenSSL?