I was using plm on a data frame of 1,922,550 rows and 17 columns (about 200 MB). The `plm()` function works perfectly on it, and so do `vcovHC()` and `vcovDC()` on smaller subsets of the original data frame. However, when I tried using `vcovHC()` to calculate double-clustered standard errors on the full data, I got a "not enough memory" error.

Is there anything I can do to get the robust standard errors?

By the way, my computer has an i7-8750HQ processor and 32 GB of RAM, and I have been running all of this on the x64 version of R. I also ran it on a machine with a 2700X and it took forever to process.

```r
vcovHC(test, method = "arellano", cluster = "time")
```
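
For completeness, the full double-clustering workflow I am after looks roughly like the sketch below. `test` is the `plm` model from above; the decomposition in the second half follows the Cameron/Gelbach/Miller two-way clustering formula (group + time - White) and is only my assumption of how the pieces fit together, so it should be cross-checked against `vcovDC()` on a subset where both approaches run.

```r
library(plm)
library(lmtest)

# Double-clustered (group and time) covariance directly from plm:
V_dc <- vcovDC(test, method = "arellano", type = "HC0")
coeftest(test, vcov. = V_dc)

# Alternatively, assemble the two-way matrix from one-way pieces
# (Cameron/Gelbach/Miller: group-clustered + time-clustered - White),
# computing one large object at a time:
V_group <- vcovHC(test, method = "arellano", cluster = "group")
V_time  <- vcovHC(test, method = "arellano", cluster = "time")  # the call that runs out of memory for me
V_white <- vcovHC(test, method = "white1")                      # heteroskedasticity-only (White) piece
V_dc2   <- V_group + V_time - V_white
coeftest(test, vcov. = V_dc2)
```
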
    Possible duplicate of https://stackoverflow.com/questions/5171593/r-memory-management-cannot-allocate-vector-of-size-n-mb – akrun May 09 '19 at 02:19
  • I know that R is relatively weak at handling large data and that errors like "cannot allocate vector of size ** Gb" happen frequently, but in my experience they are often caused by coding mistakes such as endless loops rather than by an actual shortage of memory, especially when the requested size is absurdly large, as it is in my case – Freeman Milos May 09 '19 at 02:24
  • What's more, the stargazer function also fails to handle the resulting plm object, although it works fine for plm objects obtained by running the same model on smaller subsets of the original data frame. – Freeman Milos May 09 '19 at 02:31
  • Sorry, I forgot to mention that clustering by group also works fine; it is only clustering by time that returns the error – Freeman Milos May 09 '19 at 03:59
  • Possible duplicate of [R memory management / cannot allocate vector of size n Mb](https://stackoverflow.com/questions/5171593/r-memory-management-cannot-allocate-vector-of-size-n-mb) – NelsonGon May 09 '19 at 04:56
  • Since version 2.4-0, the `vcovXX` functions use less memory, so you might want to try again with a more recent version of the package. – Helix123 Aug 09 '21 at 20:36
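
A minimal sketch of Helix123's suggestion above, assuming the fitted model is still the `test` object from the question (the commented stargazer line is one hypothetical way to pass externally computed standard errors to the table, since the default extraction also failed on the large model):

```r
# Update to plm >= 2.4-0, where the vcovXX functions were reworked to
# use less memory, then retry the clustered covariance calls
# (refit the model after updating if methods complain about the old object).
install.packages("plm")
packageVersion("plm")  # confirm the version is at least 2.4-0

library(plm)
library(lmtest)

V_time <- vcovHC(test, method = "arellano", cluster = "time")  # the call that failed before
V_dc   <- vcovDC(test, method = "arellano")                    # double clustering in one step
coeftest(test, vcov. = V_dc)

# If stargazer still chokes on the large plm object, the double-clustered
# standard errors can be supplied explicitly:
# stargazer(test, se = list(sqrt(diag(V_dc))))
```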

0 Answers