I was using plm on a data frame of 1,922,550 rows and 17 columns (about 200 MB). The plm function works perfectly on it, and so do vcovHC and vcovDC on smaller subsets of the original data frame. However, when I tried using vcovHC on the full data to calculate double-clustered standard errors, I got a "not enough memory" error.
Is there anything I can do to get the robust standard errors?
BTW, my computer has an i7-8750HQ processor and 32 GB of RAM, and I have been running all of this on the x64 version of R. I also ran it on a 2700X machine and it took forever to process.
# one-way clustering by time; this is the call that runs out of memory on the full data
vcovHC(test, method = "arellano", cluster = "time")
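For reference, here is a minimal sketch of my workflow; the synthetic data, formula, and index names below are hypothetical stand-ins for my actual data, which I can't share:

library(plm)

# synthetic balanced panel standing in for the real 1.9M-row data frame
set.seed(1)
df <- data.frame(
  id   = rep(1:100, each = 50),   # individual index
  year = rep(1:50, times = 100),  # time index
  y    = rnorm(5000),
  x    = rnorm(5000)
)

# the panel regression itself fits without trouble
test <- plm(y ~ x, data = df, index = c("id", "year"), model = "within")

# these work on smaller subsets but exhaust memory on the full data
vcovHC(test, method = "arellano", cluster = "time")
vcovDC(test)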