I want to clip a large shapefile (67 MB on disk) in R and derive a much smaller raster from roughly 5% of it. Once loaded, the shapefile has 221388 features and 5 fields, and balloons to 746 MB in memory.
My difficulty comes when trying to clip the file down to a workable size: the program crashes after a few minutes. I have tried both crop (from the raster package) and gIntersection (from rgeos) without success. I have 8 GB of RAM, so this is clearly a memory issue.
I am guessing there may be a workaround. I know there are some big-memory packages out there, but can any of them help in my kind of situation? My current code is below:
# dataset can be found at
# http://data.fao.org/map?entryId=271096b2-8a12-4050-9ff2-27c0fec16c8f
# load required packages
library(rgdal)   # ogrListLayers, ogrDrivers, readOGR
library(raster)  # crop, extent

# list the layers available at this location
ogrListLayers("C:/Users/Me/Documents/PNG Glob")
# inspect the OGR driver (row 10 is the ESRI Shapefile driver on my install)
ogrDrivers()[10,]
# read the shapefile
Glob <- readOGR("C:/Users/Me/Documents/PNG Glob", layer="png_gc_adg_1")
# assign projection (via the replacement function rather than the slot)
proj4string(Glob) <- CRS("+proj=longlat")
# object size in memory
object.size(Glob)
# clipping -- assign the result, otherwise it is printed and discarded
Glob_clip <- crop(Glob, extent(c(144, 146, -7, -5)))
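One idea I have not been able to verify: push the clipping down into GDAL, so that only the ~5% of features inside the bounding box are ever read into R, instead of loading all 746 MB and clipping afterwards. A sketch using the sf package's `wkt_filter` argument to `st_read()` (the package choice and the exact WKT string are my assumptions, not tested on this dataset):

```r
library(sf)

# bounding box of the region of interest as WKT, lon/lat order
# (hypothetical coordinates matching the extent I pass to crop above)
bb_wkt <- "POLYGON((144 -7, 146 -7, 146 -5, 144 -5, 144 -7))"

# GDAL applies the spatial filter while reading, so only features
# intersecting the box are ever loaded into memory
Glob_small <- st_read("C:/Users/Me/Documents/PNG Glob",
                      layer = "png_gc_adg_1",
                      wkt_filter = bb_wkt)
```

If something like this works, the result could presumably be converted back with `as(Glob_small, "Spatial")` for use with the raster package. I understand GDAL's command-line `ogr2ogr` has a similar `-spat xmin ymin xmax ymax` filter that would pre-clip the file before R ever opens it, which might sidestep the memory problem entirely.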