First, this question is NOT about `Error: cannot allocate vector of size n`.
I accept this error as a given, and I am trying to avoid it in code.
I have a dataset of 3000+ variables and 120,000 cases.
All columns are numeric.
I need to replace all NA values with zero.
If I reassign values to 0 for the entire dataset at once, I get the memory allocation error.
So I am reassigning the values to zero one column at a time:

```r
resetNA <- function(results) {
  for (i in seq_len(ncol(results))) {
    if (i > 10) {                               # skip the first 10 columns
      results[, i][is.na(results[, i])] <- 0
    }
  }
  print(head(results))
  results   # R passes arguments by value, so the modified copy must be returned
}
```
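For reference, the same loop can also be written with `[[` column extraction. As I understand it, this works on one column vector at a time rather than going through `[.data.frame` on the whole frame (a sketch; `resetNA2` is just a name I made up for the variant):

```r
resetNA2 <- function(results) {
  for (i in seq_len(ncol(results))) {
    if (i > 10) {                   # skip the first 10 columns, as above
      col <- results[[i]]           # extract a single column as a vector
      col[is.na(col)] <- 0          # replace NAs in that vector only
      results[[i]] <- col           # write the single column back
    }
  }
  results
}
```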
After about 1000 columns, I still get the memory allocation error.
This seems strange to me: memory use is somehow growing with each iteration of the loop, and I don't see why that should be the case.
Also, I tried calling the garbage collector (`gc()`) after each iteration, but I still got the memory allocation error.
Can someone explain how I can manage the variables to avoid this incremental increase in memory allocation? After all, the size of the data frame has not changed.
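In case it helps frame the question, here is a sketch of a by-reference approach using the third-party data.table package (this assumes data.table is installed; `set()` is documented to modify columns in place, without copying the table on each iteration):

```r
library(data.table)

dt <- as.data.table(results)   # one-time conversion (this itself copies once)
for (j in seq_len(ncol(dt))) {
  if (j > 10) {                # skip the first 10 columns, as in my loop
    # set() assigns by reference: only the affected rows of column j are touched
    set(dt, i = which(is.na(dt[[j]])), j = j, value = 0)
  }
}
```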