I have a large Stata data file (6.1 GB) on my iMac (macOS Catalina 10.15.4, 3.1 GHz processor). I have tried multiple ways to read the file into my R global environment.
library(foreign)
# foreign::read.dta only reads Stata formats up to version 12
data <- read.dta(file = "File.dta", missing.type = TRUE)
install.packages("readstata13")
library(readstata13)
data <- read.dta13(file = "File.dta")
library(haven)
data <- read_dta('File.dta')
library(memisc)
# memisc loads Stata files through an importer object,
# which as.data.set() then materializes
data <- as.data.set(Stata.file("File.dta"))
Each attempt fails with the same error:
Error: vector memory exhausted (limit reached?)
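One workaround I am considering is reading only part of the file: haven (2.2.0 or later) supports col_select and n_max arguments. A sketch of what I mean (the column names here are placeholders, not real columns from my file):

library(haven)
# Read just a few columns to reduce the memory footprint
small <- read_dta("File.dta", col_select = c(id, year, outcome))
# Or read only the first rows, to confirm the file itself is readable
peek <- read_dta("File.dta", n_max = 1000)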
I have tried to address this with the following commands to increase the memory available to R:
memory.limit(size = 12000)  # Windows-only command; does nothing on macOS
Sys.setenv('R_MAX_VSIZE' = 32000000000)  # R_MAX_VSIZE is read at startup, so setting it mid-session has no effect
options(scipen = 999)  # only controls number printing, not memory
But none of this has worked.
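If I understand the R memory documentation correctly, R_MAX_VSIZE has to be set before R launches, so one thing I plan to try is putting it in ~/.Renviron and restarting R. A sketch (32Gb mirrors the value I tried above and assumes my machine can back it with RAM plus swap):

# Append the limit to ~/.Renviron, then restart R for it to take effect
cat("R_MAX_VSIZE=32Gb\n", file = "~/.Renviron", append = TRUE)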
Has anyone had this problem on a Mac and been able to fix it?