In the Haskell Wiki article on prime numbers, the following implementation of the Sieve of Eratosthenes is described:
primes = 2 : 3 : minus [5,7..] (unionAll [[p*p, p*p+2*p..] | p <- tail primes])
When evaluating

primes !! 2

how does Haskell recognize, in this specific situation, that it should not try every p in tail primes (the infinite list [3,5,7,11,…]), but instead only needs the set minus against the multiples of 3?
In other words: how does Haskell know that none of the other p's (or their multiples) will affect 5, which is the eventual answer? Is there a good rule of thumb for knowing when the compiler and runtime are smart enough to handle cases involving infinite lists?
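For reference, here is a self-contained version of the snippet that can be loaded into GHCi and experimented with. In the wiki article, minus and unionAll come from the data-ordlist package; the definitions below are minimal local sketches (the real unionAll uses a more efficient tree-shaped merge, but a simple right fold suffices here because the head of each [p*p, p*p+2*p..] list grows with p):

```haskell
-- Remove members of the second ordered list from the first.
minus :: Ord a => [a] -> [a] -> [a]
minus xs@(x:xt) ys@(y:yt)
  | x < y     = x : minus xt ys
  | x == y    = minus xt yt
  | otherwise = minus xs yt
minus xs _ = xs

-- Merge two ordered lists, dropping duplicates.
union' :: Ord a => [a] -> [a] -> [a]
union' xs@(x:xt) ys@(y:yt)
  | x < y     = x : union' xt ys
  | x == y    = x : union' xt yt
  | otherwise = y : union' xs yt
union' xs ys = xs ++ ys

-- Lazily merge an infinite list of ordered lists. Emitting the head of
-- the first list *before* touching the rest is what keeps this
-- productive: the later lists are only forced on demand.
unionAll :: Ord a => [[a]] -> [a]
unionAll ((x:xs):t) = x : union' xs (unionAll t)
unionAll ([]:t)     = unionAll t
unionAll []         = []

primes :: [Int]
primes = 2 : 3 : minus [5,7..] (unionAll [[p*p, p*p+2*p..] | p <- tail primes])

main :: IO ()
main = print (primes !! 2)  -- prints 5
```

Tracing primes !! 2 shows the laziness at work: minus compares 5 against the head of the merged composites, which is 3*3 = 9 from the very first list; since 5 < 9, the 5 is emitted without any later p ever being forced.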