
I'm trying to read a 10 GB CSV file, and have tried on both Mac and Windows, but I'm getting the error "vector memory exhausted" on Mac and "cannot allocate vector of size 500.0 Mb" on Windows.

The data I'm working with is the "US - Fixed with Satellite - Jun 17" from https://www.fcc.gov/general/broadband-deployment-data-fcc-form-477

I know this question has been asked before, but I still haven't been able to get the csv file to successfully read.

Would someone be able to try? Also let me know if more details about the circumstances are needed.

asked by joat1
  • How much RAM do your machines have? If it's less than 10GB, there is no way you'll be able to read in the full file in one go. Even with 16GB+ RAM there is no guarantee that directly reading the file with `read.csv` will work. You probably need to look into ways to read only part of the data at a time, or tools that allow you to keep the data on disk. – Marius Oct 22 '18 at 05:14
  • If you can't load the entire thing into R, one alternative might be to load it into a database. From there, you may export a subsample which could fit in R directly. – Tim Biegeleisen Oct 22 '18 at 05:16
  • My Mac has 16GB; the Windows machine has even less. I'll look into and try both of those suggestions. Thanks for the guidance! – joat1 Oct 23 '18 at 14:49
  • Possible duplicate of [R memory management / cannot allocate vector of size n Mb](https://stackoverflow.com/questions/5171593/r-memory-management-cannot-allocate-vector-of-size-n-mb) – Benjamin Jan 20 '19 at 14:51
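A minimal sketch combining both suggestions above: stream the CSV in chunks with `readr::read_csv_chunked` and append each chunk to an on-disk SQLite table, so the full 10 GB file never has to fit in RAM at once. This assumes the `readr`, `DBI`, and `RSQLite` packages are installed; the filename `fbd_us_with_satellite.csv`, the chunk size, and the `StateAbbr` column are illustrative placeholders, not confirmed details of the FCC file.

```r
library(readr)
library(DBI)

# Open (or create) an on-disk SQLite database next to the CSV.
con <- dbConnect(RSQLite::SQLite(), "fcc477.sqlite")

# Read the CSV 100,000 rows at a time; each chunk is appended to the
# "form477" table and then discarded, keeping memory use roughly constant.
read_csv_chunked(
  "fbd_us_with_satellite.csv",  # placeholder filename
  callback = function(chunk, pos) {
    dbWriteTable(con, "form477", as.data.frame(chunk), append = TRUE)
  },
  chunk_size = 100000
)

# Pull back only a subset that fits comfortably in RAM
# ("StateAbbr" is an assumed column name):
ca_rows <- dbGetQuery(con, "SELECT * FROM form477 WHERE StateAbbr = 'CA'")

dbDisconnect(con)
```

Once the data is in SQLite you can filter, aggregate, or sample with SQL and only bring the reduced result into R, which sidesteps the allocation errors entirely.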

0 Answers