Having read the explanation at this link: https://stackoverflow.com/questions/1727772/quickly-reading-very-large-tables-as-dataframes , I tried to read a large CSV file containing 8 million rows and 26 columns using the feather library, as well as other packages such as vroom and data.table. However, they all give me the same error:
library(feather)

file <- "file.feather"
write_feather(read_df, file)   # read_df is the data frame read from the CSV
new_df <- read_feather(file)

ERROR: Cannot allocate vector of size 61.9 Mb
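For completeness, the other read attempts were along these lines (a sketch only; "file.csv" is a placeholder for the actual path, and the exact arguments I used may have differed):

library(vroom)
library(data.table)

# vroom reads columns lazily via ALTREP
vroom_df <- vroom("file.csv")

# data.table::fread loads the full table into memory up front
dt_df <- fread("file.csv")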
I then browsed around and found that, in such cases, R's memory size should be increased using memory.size(max = ....Mb). That did not resolve the issue either.
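A minimal sketch of the kind of call this refers to (assuming Windows, where these functions apply; the size value below is a made-up example, and both functions stopped working from R 4.2 onwards):

# Windows-only; no longer functional in R >= 4.2
memory.size(max = TRUE)      # report the maximum memory obtained from the OS so far
memory.limit(size = 56000)   # request a higher limit, in MB (example value)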
Can someone explain to me how to resolve such issues?
Any feedback is appreciated.