
I have been looking at solutions posted online for increasing the memory limit for R, but these solutions seem to work only on Windows or Linux systems.

I am using macOS Mojave 10.14.5 with 8 GB of memory and a 2.3 GHz Intel Core i5. My RStudio is 64-bit, version 1.1.453.

Here is the report from the `gc()` function:

> gc()
           used  (Mb) gc trigger   (Mb) limit (Mb)  max used   (Mb)
Ncells  6453699 344.7   11897884  635.5         NA  11897884  635.5
Vcells 44221701 337.4  179064532 1366.2       7168 219267441 1672.9

I am wondering why the limits for Ncells and Vcells are so low -- 635.5 Mb and 1672.9 Mb. Does this mean R is currently using only that amount of memory? That is my suspicion, and so I want to increase the limit.

What I am trying to do is merge a data frame with 227,795 rows with another data frame that has the same number of rows but different columns. This gives me the following error:

Error: vector memory exhausted (limit reached?) 
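The merge is along these lines; the object names and the key column `id` are placeholders, since the exact call is not the point here:

```
# Placeholder names: df1, df2, and the key column "id" stand in for the
# real data frames and join column
merged <- merge(df1, df2, by = "id")
```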

This error also occurs when I try to build a large matrix of distances between 227,796 sets of coordinates.
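The distance step is roughly the following; `coords` is a placeholder name for the matrix of coordinates:

```
# coords: a placeholder two-column matrix with one row per set of coordinates
# dist() computes every pairwise distance in a single object
d <- dist(coords)
```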

Does anyone have a solution for increasing R's memory limit on a Mac? It would be great if there were a `memory.limit()` equivalent for macOS.
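On Windows, I understand the cap can be raised with a call like the one below; I am looking for the macOS counterpart:

```
# Windows-only in base R; size is the requested limit in Mb
memory.limit(size = 16000)
```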

imguessing
  • Suggested dupe: [R on MacOS Error: vector memory exhausted (limit reached?)](https://stackoverflow.com/q/51295402/903061) – Gregor Thomas Jun 25 '19 at 13:59
  • Also, other things you don't mention in the question so I want to check -- what else do you have open and using memory? What does your Activity Monitor utility show? When you say you want to merge a data frame with 227,795 rows with another that has different columns, are you merging on anything? If you're doing a cross join, that would be a 52 trillion row result, which will exhaust the memory of much larger systems than yours. What are you merging on? How many duplicate keys are there in each? Are you using `data.table` for the most memory efficient joins, or something else? – Gregor Thomas Jun 25 '19 at 14:02
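A minimal sketch of the keyed `data.table` join the comment above refers to, assuming both tables share a key column; the names `df1`, `df2`, and `id` are placeholders:

```
library(data.table)

# Placeholder key column "id"; setDT() converts in place without copying
setDT(df1); setDT(df2)
setkey(df1, id)
setkey(df2, id)

# Keyed join: look up the rows of df1 in df2 on "id"
merged <- df2[df1]
```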

0 Answers