When working with floating-point values, one should use
a <= 0.0
instead of
a == 0.0
to make sure you get the desired behaviour, because round-off errors mean a floating-point variable may never compare exactly equal to the value you are testing against.
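For example (a minimal sketch in C++, just to illustrate what I mean; the exact printed digits may vary), repeatedly adding 0.1 never lands exactly on 1.0:

#include <cstdio>

int main()
{
    // Adding 0.1 ten times does not give exactly 1.0, because 0.1
    // has no exact binary floating-point representation.
    double sum = 0.0;
    for (int i = 0; i < 10; i++)
    {
        sum += 0.1;
    }

    std::printf("sum = %.17f\n", sum);  // roughly 0.99999999999999989
    std::printf("sum == 1.0 is %s\n", sum == 1.0 ? "true" : "false");              // false
    std::printf("sum - 1.0 <= 0.0 is %s\n", sum - 1.0 <= 0.0 ? "true" : "false");  // true
}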
But is it useful to do the same with other variable types, like int? Say you have a for loop iterating over an int variable (like an index) and it should do something when the variable reaches a certain value. Should you then write the comparison as >= instead of ==, even though both should give the same output? In other words, could there ever be a case where the == is never true in the following code:
for (int i = 0; i < 10; i++)
{
    if (i == 5)
    {
        break;
    }
}
And would it therefore be "safer" to do the following instead:
for (int i = 0; i < 10; i++)
{
    if (i >= 5)
    {
        break;
    }
}
If there is no difference between the two in terms of safety, is there any performance or readability difference, or anything else, that would make one choose one way over the other?
I tried to google this but couldn't find anything stating it either way, though that might be because searching for operators is difficult.
Am I too paranoid for asking this?