When summing the first 100,000,000 positive integers using the following:
import numpy as np
np.arange(1,100000001).sum()
I get 987459712, which does not match the formula N(N+1)/2 for N = 100,000,000; the formula gives 5000000050000000.
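For reference, a quick sanity check of the expected value using plain Python integers (which do not overflow) looks something like this:

# Closed-form sum of the first N positive integers, evaluated with
# Python's arbitrary-precision ints rather than NumPy.
N = 100000000
expected = N * (N + 1) // 2
print(expected)  # 5000000050000000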
Before posting, I wrote the following, which returns True:
np.arange(1,65536).sum() == ((65535+1) * 65535)/2
However, the number 65536 seems to be a critical point, as
np.arange(1,65537).sum() == ((65536+1) * 65536)/2
returns False.
For N of 65536 or more, the comparison returns False, whereas for N below this threshold it returns True.
Could someone explain either what I've done wrong in calculating the sum, or what is going on with the code?