I have been trying to read many files into R using several different methods that have worked for me in the past but, for some reason, do not work here. I have read many posts on the forum that address the different ways this can be done, but none of them seem to solve my problem; the files are larger, I suppose.
Here are the different things I have tried:
# list all .txt files in the working directory and read each one into a list
files <- list.files(pattern = "\\.txt$")
listOfFiles <- list()
for (i in seq_along(files)) {
  listOfFiles[[i]] <- read.table(files[i], header = TRUE, sep = "\t",
                                 stringsAsFactors = FALSE)
}
However, when I run this, my computer just freezes and stops responding. This has led me to believe that it may be a memory issue; however, I have tried increasing memory.limit() to about 12000 MB and it still does not run.
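For reference, this is roughly the call I used to raise the limit (memory.limit() is the Windows-only base R function; the size value is in megabytes and the exact number I used is approximate):

memory.limit(size = 12000)  # raise the R memory limit to roughly 12 GB (Windows only)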
There is a posting here that partly addresses the issue at hand: Quickly reading very large tables as dataframes. The reason my problem differs is that I know the scripts I am running work, just not on this many files, which total more than 2 GB. I believe this is a memory issue because, when I ran it again, I got the error:
Error: cannot allocate vector of size 7.8 Mb
I have read other posts on the forum that use lapply, so I thought I'd try it out; however, it has also failed to work.
Here is what I did:
listo <- lapply(files, read.table)
This, on the other hand, runs, but when I try to open the list listo, it gives me the error:
Error: object 'listo' not found
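In case it matters, this is roughly what I mean by "opening" the list; the exact calls below are just an illustration of what I typed after the lapply line appeared to finish:

length(listo)      # this is where the 'object not found' error appears
str(listo[[1]])    # inspecting the first element fails the same way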
Any help would be much appreciated.