I have 6 GB of memory on my machine (Windows 7 Pro, 64-bit), and in R I get
> memory.limit()
6141
Of course, when dealing with big data, memory allocation errors occur. So, to make R use virtual memory, I raise the limit:
> memory.limit(50000)
Now, when running my script, I no longer get memory allocation errors, but R hogs all the memory on my computer, so I can't use the machine until the script finishes. I wonder if there is a better way to make R manage the machine's memory. For example, it could spill over to virtual memory once its physical memory use exceeds a user-specified threshold. Is there an option like that?
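To clarify what I'm after, here is a rough sketch of the behaviour I'd like, done by hand. The check_memory() helper and the 5000 MB cap are just illustrations of the idea (memory.size() and memory.limit() are Windows-only):

## Manual approximation of the option I'm looking for: once the
## session's footprint passes a cap below physical RAM, try to
## garbage-collect instead of letting R grab everything.
check_memory <- function(cap_mb = 5000) {  # stay under my 6 GB of physical RAM
  used <- memory.size()                    # MB currently used by this R session
  if (used > cap_mb) {
    gc()                                   # free unreferenced objects first
    used <- memory.size()
  }
  invisible(used)
}

With this I would have to call check_memory() between the big steps of my script myself; I'm hoping there is a built-in option that handles this automatically.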