I am reading an Excel sheet into R with XLConnect. It works very well. However, if I re-run the command (after changing values in the Excel file, for example), the function runs out of memory.
The file/sheet I am reading has 18 columns and 363 rows of numeric data.
The error message is
Error: OutOfMemoryError (Java): Java heap space
which appears on the second (identical) run of a readWorksheetFromFile call. I am trying to produce an MWE by repeatedly running the input call from this example, but the error does not seem to be reproducible with that file.
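For reference, the failing pattern boils down to the following (the file path and sheet name here are placeholders for my actual workbook):

```r
library(XLConnect)

## First read: works as expected
df <- readWorksheetFromFile("model.xlsx", sheet = "Data",
                            useCachedValues = TRUE)

## ... edit and save the Excel file, then re-run the identical call ...

## Second read: fails with
## Error: OutOfMemoryError (Java): Java heap space
df <- readWorksheetFromFile("model.xlsx", sheet = "Data",
                            useCachedValues = TRUE)
```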
The Excel file I am using has many interconnected sheets and is about 3 MB. The sheet that I am reading is also linked to others, but I have set useCachedValues = TRUE.
It seems to me that, after executing the first call, the Java memory is not cleared. The second call then attempts to fill more data into memory, which causes the call to fail. Is it possible to force a garbage collection on the Java memory? Currently, the only solution is restarting the R session, which is not practical for my clients.
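Concretely, I am hoping for something I can run between the two reads to release the Java heap. My (possibly naive) sketch of what that might look like is below; xlcFreeMemory() is XLConnect's helper for freeing JVM memory, but I do not know whether it is the right tool here:

```r
## Drop the R-side objects and collect on the R side ...
rm(df)
gc()

## ... then ask the JVM to garbage-collect as well.
xlcFreeMemory()
## (or perhaps directly via rJava: rJava::.jcall("java/lang/System", "V", "gc"))
```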
I know that expanding the Java memory might solve this, but that strikes me as a clumsy solution. I would prefer to find a way to dump the memory from previous calls.
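For completeness, my understanding of the heap-expansion route (which I would rather avoid) is that the option must be set in a fresh session, before rJava is initialised; the size below is just an arbitrary example:

```r
## Fresh R session, before XLConnect/rJava is loaded
options(java.parameters = "-Xmx2g")
library(XLConnect)
```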
I have also tried using the more verbose loadWorkbook and readWorksheet functions; the same error occurs.
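That attempt looked roughly like this (placeholder names again):

```r
## Two-step version of the read above
wb <- loadWorkbook("model.xlsx")
df <- readWorksheet(wb, sheet = "Data", useCachedValues = TRUE)
```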
Let me know if there is any other information that would be useful!