
Flip a coin: on a success you win 100, otherwise you lose 50. You keep playing as long as you have money in your pocket, a. How can the value of a at each iteration be stored?

a <- 100                          # starting money in pocket
while (a > 0) {
  if (rbinom(1, 1, 0.5) == 1) {   # flip a fair coin
    a <- a + 100                  # win: gain 100
  } else {
    a <- a - 50                   # loss: lose 50
  }
}

When the while loop ends, I would like to be able to look at the value of a at each iteration, not just the final result. I consulted the post on Counting the iteration in sapply, but I wasn't able to apply it to this case.

Worice

1 Answer

Store the initial value of a in a second vector, and append the new value of a at each iteration.

a <- pocket <- 100                # pocket starts as a length-1 vector holding the initial balance
while (a > 0) {
  if (rbinom(1, 1, 0.5) == 1) {
    a <- a + 100
  } else {
    a <- a - 50
  }
  pocket <- c(pocket, a)          # append the balance after this flip
}
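
For example (an illustrative addition on my part, not part of the original answer; the `plot` call and its labels are just one way to look at the result), once the loop has finished you can inspect the recorded trajectory directly:

length(pocket) - 1   # number of flips played
tail(pocket)         # the last few balances; the final one is 0, since the loop stops once the balance is no longer positive
plot(pocket, type = "s", xlab = "flip", ylab = "money in pocket")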

Of course, a vectorised approach may be more efficient, e.g.:

n <- 1000000
x <- c(100, sample(c(100, -50), n, replace=TRUE))  # initial 100 plus n simulated wins/losses
cs <- cumsum(x)                                    # running balance after each flip
cs[1:match(0, cs)]                                 # trajectory up to the first time the balance hits 0

But there's no guarantee you'll run out of money within n iterations; in that case `match` returns `NA`, the subscripting throws an error, and you can just look at `cs` to see the realised trajectory so far.
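
If you'd rather handle that case without an error, here's a minimal sketch (my own illustration, not part of the original answer; `hit` and `trajectory` are just illustrative names) that checks whether `match` found a zero before subscripting:

n <- 1000000
x <- c(100, sample(c(100, -50), n, replace = TRUE))
cs <- cumsum(x)                      # running balance after each flip
hit <- match(0, cs)                  # index of the first zero balance, or NA if we never go broke
if (is.na(hit)) {
  trajectory <- cs                   # still solvent after n flips: keep the whole simulated path
} else {
  trajectory <- cs[seq_len(hit)]     # path up to and including bankruptcy
}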


EDIT

In response to concerns voiced by @Roland, the following approach avoids growing the result at every iteration by pre-allocating a vector and extending it in chunks of n only when needed:

n <- 1e6
a <- rep(NA_integer_, n)
a[1] <- 100L # set initial value (integer)
i <- 1 # counter
while(a[i] > 0) {
  # first check whether our results will fit. If not, embiggenate `a`.
  if(i==length(a)) a <- c(a, rep(NA_integer_, n))
  if (rbinom(1, 1, 0.5) == 1) {
    a[i+1] <- a[i] + 100L
  } else {
    a[i+1] <- a[i] - 50L
  }
  i <- i + 1
}
a[seq_len(i)]  # return only the filled portion of the vector
jbaums
  • Yes indeed. With the current parameters it is quite common that you will play "forever". Anyway, your suggestion is perfect. Can you suggest any didactic material to better comprehend vectorized operations? For now, my knowledge of the matter is really basic. – Worice Jan 04 '16 at 01:35
  • @Worice - I can't think of any resources off the top of my head. The main time saver above (at least for big `n`) is the use of `sample`. e.g. for `n <- 1e7`, `sample(0:1, n, replace=TRUE)` takes about 0.2 sec, while `for(i in 1:n) sample(0:1, 1)` takes about 24 sec. We call `sample` once instead of 10 million times. Similarly we could have used `rbinom(n, 1, 0.5)` (with a bit of extra post-processing work). Vectorised operations can very frequently replace loops, but knowing when and where largely comes down to familiarity with the available functions (read the function docs); see the timing sketch after these comments. – jbaums Jan 04 '16 at 01:48
  • Your example on the difference between `sample` and `for` is straightforward. Now I have a better grasp of the potential of a vectorized approach. Thank you. – Worice Jan 04 '16 at 01:52
  • @Worice [Worth reading](http://www.noamross.net/blog/2014/4/16/vectorization-in-r--why.html); the author lists resources on vectorization at the end of the article as well. – nrussell Jan 04 '16 at 01:54
  • This article sheds light on the matter @nrussell, it is really straightforward even for a beginner, thanks a lot. – Worice Jan 04 '16 at 02:04
  • Please do not grow an object with each iteration. If you don't know the final size, pre-allocate a guesstimate, test in each iteration whether it still has enough room, and grow it by chunks if it hasn't. – Roland Jan 04 '16 at 08:22
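
The timing comparison jbaums mentions in the comments can be reproduced with `system.time()`. This is only an illustrative sketch, not part of the original answer; the ~0.2 sec and ~24 sec figures come from the comment above, and your own timings will vary by machine:

n <- 1e7

system.time(sample(0:1, n, replace = TRUE))    # one vectorised call (about 0.2 sec in the comment)

system.time(for (i in 1:n) sample(0:1, 1))     # ten million separate calls (about 24 sec in the comment)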