
Having read the explanation at this link: https://stackoverflow.com/questions/1727772/quickly-reading-very-large-tables-as-dataframes , I tried to read a large CSV file containing 8 million rows and 26 columns with the feather library, and also with vroom, data.table, etc. However, each of them gives me the same error:

library(feather)

file <- "file.feather"
write_feather(read_df, file)   # read_df was loaded from the CSV beforehand
new_df <- read_feather(file)

ERROR: Cannot allocate vector of size 61.9 Mb
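
For reference, the data.table and vroom attempts looked roughly like this (a minimal sketch; "file.csv" is a placeholder for the actual path, and I only show the bare read calls):

library(data.table)
library(vroom)

dt_df <- fread("file.csv")    # data.table's fast CSV reader
vr_df <- vroom("file.csv")    # vroom reads columns lazily where it can

Both calls end with the same allocation error as the feather round trip above.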

I browsed around afterwards and found that in such cases the memory available to R should be increased using memory.size(max = ....Mb). That did not resolve the issue either.
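
For completeness, this is roughly what I tried (a minimal sketch; the size value is just an example, and as I understand it memory.limit() is the Windows-only function that actually raises the limit, while memory.size() only reports usage):

memory.limit(size = 16000)    # request a limit of roughly 16 GB (Windows only; has no effect in recent R versions)

The error still appears afterwards.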

Can someone explain to me how to resolve such issues?

Any feedback is appreciated.

  • See if this helps: https://stackoverflow.com/questions/39678940/how-to-deal-with-a-50gb-large-csv-file-in-r-language , https://stackoverflow.com/questions/38536226/r-reading-a-huge-csv , https://stackoverflow.com/questions/45594639/how-do-i-import-a-large-6-gb-csv-file-into-r-efficiently-and-quickly-without – Nad Pat Oct 01 '21 at 17:14
  • Thanks! I am going to check them all. – AKA Oct 01 '21 at 17:33

0 Answers