I'm trying to work with a dataset of 1,909 rows by 139,352 columns in R. Since my computer only has 2 GB of RAM, the dataset (about 500 MB) is too big to handle with the conventional in-memory methods: loaded as plain integers it would already need roughly 1909 x 139352 x 4 bytes, about 1 GB, before any parsing overhead. So I decided to use the ff package and read the file in chunks. However, I've been having trouble: read.table.ffdf cannot get past the first chunk of data. This is the call I'm using:
txtdata <- read.table.ffdf(file="/directory/myfile.csv",
FUN="read.table",
header=FALSE,
sep=",",
colClasses=c("factor",rep("integer",139351)),
first.rows=100, next.rows=100,
VERBOSE=TRUE)
and this is what I get:

read.table.ffdf 1..100 (100) csv-read=77.253sec
Error in ff(initdata = initdata, length = length, levels = levels, ordered = ordered, :
  write error
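For reference, my understanding is that read.table.ffdf reads the first first.rows rows with read.table, uses that chunk to create an on-disk ffdf, and then appends each following block of next.rows rows to it. Below is a minimal sketch of that pattern on a tiny synthetic file, just to show the workflow I'm expecting; the toy file, its column count and the chunk sizes are made up purely for illustration.

library(ff)

# Tiny synthetic CSV: one id column plus five integer columns, ten rows
# (file name and dimensions are made up for illustration)
toy <- tempfile(fileext = ".csv")
write.table(data.frame(id = letters[1:10], matrix(1:50, nrow = 10)),
            file = toy, sep = ",", row.names = FALSE, col.names = FALSE)

# Same shape of call as above: the first chunk initialises the on-disk ffdf,
# each later chunk of next.rows rows is appended to it
toydata <- read.table.ffdf(file = toy,
                           FUN = "read.table",
                           header = FALSE,
                           sep = ",",
                           colClasses = c("factor", rep("integer", 5)),
                           first.rows = 4, next.rows = 4,
                           VERBOSE = TRUE)
dim(toydata)  # 10 rows x 6 columns, backed by ff files on disk rather than RAM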
Does anyone have any idea what is going on?