I know that there are some similar questions out there, and that the answer probably has to do with how R handles numbers. But I would like to understand the following:
In a custom function in R, I check whether the input variable (with two decimals) is part of the defined range for which the function is programmed, in this case between 1 and 3. If the input variable is not within that range, stop() is triggered.
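For context, here is a minimal sketch of such a function; the name and the error message are my own placeholders, the real function does more than this:

```r
# Minimal sketch of the range check; my_fun is a hypothetical name.
# The membership test below is where the problem shows up.
my_fun <- function(x) {
  if (!(x %in% seq(1, 3, by = 0.01))) {
    stop("x must be between 1 and 3")
  }
  x  # placeholder for the actual computation
}
```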
The function works fine until it receives the value 1.66. Clearly, 1.66 lies between 1 and 3, so I expected the membership test to return TRUE, but it doesn't:
1.66 %in% 1:3
#> [1] FALSE
1.66 %in% seq(1, 3, by = 0.01)
#> [1] FALSE
1.66 %in% seq(1, 3, by = 0.001)
#> [1] FALSE
1.66 %in% seq(1, 3, by = 0.0001)
#> [1] FALSE
1.66 %in% seq(1, 3, by = 0.00001)
#> [1] FALSE
Only when I use the following logical test is TRUE returned:
1.66 < 3 & 1.66 > 1
#> [1] TRUE
Created on 2019-06-10 by the reprex package (v0.3.0)
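For what it's worth, a tolerance-based membership check (my own workaround, not necessarily the idiomatic approach) does behave the way I expected:

```r
# Exact equality on doubles can fail even when two values "look" equal,
# so compare against the table within a small tolerance instead.
near_in <- function(x, table, tol = 1e-9) {
  any(abs(table - x) < tol)
}
near_in(1.66, seq(1, 3, by = 0.01))
#> [1] TRUE
```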
This seems similar to the following question: How to store 1.66 in NSDecimalNumber
I would like to understand why the last logical test gives the correct answer while the others don't. Is it common practice to use comparisons like that last one to avoid these mistakes?