I have been looking at solutions posted online for increasing R's memory limit, but they only seem to apply to Windows or Linux systems.
I am using macOS Mojave 10.14.5 with 8 GB of memory and a 2.3 GHz Intel Core i5. My RStudio is 64-bit, version 1.1.453.
Here's the report from the gc() function:

    > gc()
               used  (Mb) gc trigger   (Mb) limit (Mb)  max used   (Mb)
    Ncells  6453699 344.7   11897884  635.5         NA  11897884  635.5
    Vcells 44221701 337.4  179064532 1366.2       7168 219267441 1672.9
I am wondering why the values for Ncells and Vcells are so low -- 635.5 MB and 1672.9 MB. Does this mean R is currently only using that amount of memory? That is my suspicion, so I want to increase the limit.
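For reference, this is how I checked whether an explicit vector-heap cap is set in my session. As I understand it from `?Memory`, R consults the `R_MAX_VSIZE` environment variable at startup, and the "limit (Mb)" column of `gc()` reflects that cap (while "gc trigger" is just the point where the next collection runs, not a hard ceiling) -- please correct me if that reading is wrong:

```r
# Returns the cap R was started with, or "" if none was set explicitly.
# R_MAX_VSIZE is documented in ?Memory as the macOS/Unix way to bound
# the vector heap; there is no memory.limit() on this platform.
Sys.getenv("R_MAX_VSIZE")

# The "limit (Mb)" column here should correspond to that cap;
# "gc trigger" only marks when the next garbage collection fires.
gc()
```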
What I am trying to do is merge a dataframe with 227,795 rows with another dataframe that has the same number of rows but different columns. This gives me the error:
Error: vector memory exhausted (limit reached?)
This error also occurs when I try to build a large matrix of distances between 227,796 sets of coordinates.
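In case it is relevant, here is my own back-of-the-envelope arithmetic (not from any error message) for why the distance matrix in particular may be hopeless on 8 GB no matter what limit is set:

```r
# Memory needed for pairwise distances between n points,
# stored as doubles (8 bytes each).
n <- 227796

full_gb <- n^2 * 8 / 1024^3            # dense n x n matrix
tri_gb  <- n * (n - 1) / 2 * 8 / 1024^3 # lower triangle, as dist() stores it

full_gb  # roughly 386 GB
tri_gb   # roughly 193 GB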
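So even the triangular `dist()` representation would need on the order of 193 GB, far beyond my 8 GB of RAM.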
Does anyone have any solutions for increasing R's memory limit on a Mac? It would be great if there were a memory.limit() equivalent for macOS.
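From reading `?Memory`, it sounds like the closest macOS equivalent might be setting `R_MAX_VSIZE` in the per-user startup file `~/.Renviron` rather than calling a function. A sketch of what I mean (the `16Gb` value is a placeholder I made up, and I have not confirmed this fixes my error):

```shell
# Append a vector-heap cap to the per-user R startup environment file.
# R reads ~/.Renviron on the next session start; 16Gb is a placeholder.
echo 'R_MAX_VSIZE=16Gb' >> ~/.Renviron
```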