Obviously,
for ( int i = 0; i != n; ++i )
works just the same as
for ( int i = 0; i < n; ++i )
But is it normal for a developer to feel uneasy about the former? I'm always afraid that some freak accident will make the i in the former loop "miss" the n. Whereas, in the second loop, if i somehow stops incrementing correctly, the < operator will act as a barrier.
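To make the worry concrete, here is a contrived sketch of the kind of failure I have in mind (the step of 2 and the cap of 20 are just my own assumptions to force i to skip over n):

#include <iostream>

int main() {
    const int n = 5;

    // Contrived bug: i steps by 2, so it goes 0, 2, 4, 6 and never equals n.
    // With != the loop would keep going; I cap it at 20 so the demo terminates
    // instead of running on toward signed overflow (undefined behaviour).
    for ( int i = 0; i != n && i < 20; i += 2 )
        std::cout << "!= loop, i = " << i << '\n';

    // The < form stops the first time i passes n, buggy step or not.
    for ( int i = 0; i < n; i += 2 )
        std::cout << "<  loop, i = " << i << '\n';
}

With n = 5, the != version keeps printing up to i = 18 (and would run far longer without the cap), while the < version stops right after i = 4.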
This made me wonder today which of the two is actually more likely to fail, given the full spectrum of freak accidents. Someone here with knowledge of compiler/hardware internals might have an answer for me.