I have a large "big.matrix" and I need to remove a few columns from it. It was created from a CSV file (with 72 million rows) using
BigMat <- read.big.matrix("matrix.csv", type="double", header=TRUE,
backingfile="matrix.bin",
descriptorfile="matrix.desc")
This loads the matrix into R successfully, but I do not have enough memory to create a new object when I try to subset it:
BigMatSub <- BigMat[, 5:71]
It gave me: Error: cannot allocate vector of size 37.6 Gb. (I gather this is because subsetting a big.matrix with [ returns an ordinary in-memory R matrix rather than another file-backed object.)
Is there any way of removing the columns without hitting the memory limit? I need the result to be a "big.matrix" object in the end so that I can use it with biglasso().
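
For what it's worth, I came across sub.big.matrix() and deepcopy() in the bigmemory docs and wondered whether something like the sketch below might avoid the in-RAM copy (the matrix_sub.* filenames are placeholders I made up), but I am not sure this is the right approach:

library(bigmemory)

# Since 5:71 is a contiguous range, sub.big.matrix() should give a
# no-copy view that shares the existing backing file:
BigMatSub <- sub.big.matrix(BigMat, firstCol = 5, lastCol = 71)

# Alternatively, for an arbitrary column subset, deepcopy() writes a
# new file-backed big.matrix without loading everything into RAM:
BigMatSub2 <- deepcopy(BigMat, cols = 5:71,
                       backingfile = "matrix_sub.bin",
                       descriptorfile = "matrix_sub.desc")

I don't know whether either of these behaves like a regular big.matrix when passed to biglasso().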
The matrix is sparse with many zero values.
Any help is much appreciated.