
I just started learning R and was experimenting a bit when I came across this issue.

My code was:

options(digits=20)

i <- 0:20

10^i + 1

and my output was the following:

##  [1]                     2                    11                   101
##  [4]                  1001                 10001                100001
##  [7]               1000001              10000001             100000001
## [10]            1000000001           10000000001          100000000001
## [13]         1000000000001        10000000000001       100000000000001
## [16]      1000000000000001     10000000000000000    100000000000000000
## [19]   1000000000000000000  10000000000000000000 100000000000000000000

Why are the last 5 values missing a one at the end? Sorry if this is all messy; I'm new here, too.

thanks for the help.

Jamie

1 Answer


I'm not sure if the 'hints' others provided to you via the comments were helpful or not, but basically what you are seeing is the limit of storing very large numbers in a computer. At some point you reach a number the computer can no longer represent exactly. This also applies to very small numbers; these failures are sometimes called 'overflow' and 'underflow'. The point of the example

identical(10^15, (10^15 + 1))
identical(10^16, (10^16 + 1))

is to point out that 10^15 + 1 can be handled correctly, but 10^16 + 1 cannot: you've reached the breaking point. The help page at ?.Machine discusses some of the technical details. There is a little more info here, but all these discussions get technical quickly.
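As a sketch of where that breaking point comes from: R's numbers are double-precision floats with a 53-bit mantissa, so integers are only guaranteed exact up to 2^53 (about 9.007e15). Since 10^16 is beyond that, adding 1 to it is lost to rounding:

```r
# Doubles carry 53 bits of precision, so consecutive integers are
# representable only up to 2^53 (roughly 9.007e15).
2^53                         # the point where the gap between doubles reaches 1

# Below the cutoff, + 1 changes the value; above it, the result
# rounds back to the same double.
identical(10^15 + 1, 10^15)  # FALSE: 10^15 + 1 is still exactly representable
identical(10^16 + 1, 10^16)  # TRUE:  10^16 + 1 rounds back to 10^16
```

That matches the output in the question: the ones disappear starting at 10^16 + 1, the first value past 2^53.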

Bryan Hanson
  • This isn't over/under-flowing, because double floating point representation can store numbers up to approximately 10^(+/-308) (see `.Machine$double.xmax` and `.Machine$double.xmin`). This is a precision issue, related to the accuracy of a base-10 number represented in a base-2 numeral system (see `.Machine$double.eps` and `.Machine$double.neg.eps`). – Joshua Ulrich Jan 18 '16 at 02:30
  • @JoshuaUlrich Thanks for the clarification. – Bryan Hanson Jan 18 '16 at 02:31