I would argue that this is possible, but it requires some parallel processing capability. Each worker loads one .RData file and returns only the desired object; merging the results afterwards is then pretty straightforward.
I can't provide code tailored to your data because I don't know its structure, but I would do something along the lines of the below chunk'o'code. Note that I'm on Windows, so your workflow may differ, and make sure you are not short on memory, since every worker loads a full .RData file. Also, snowfall is not the only interface for using multiple cores.
# load library snowfall and set up working directory
# to where the RData files are
library(snowfall)
working.dir <- "/path/to/dir/with/files"
setwd(working.dir)
# initiate (redneck jargon: and then she ate) workers and export
# working directory. Working directory could be hard coded into
# the function, rendering this step moot
sfInit(parallel = TRUE, cpus = 4, type = "SOCK")
sfExport(list = c("working.dir")) # export every variable the worker function needs, except its argument x
# read filenames and step through each, returning only the
# desired object
lofs <- list.files(pattern = "\\.RData$") # escape the dot and anchor the end, or the regex will also match e.g. "myRData.csv"
inres <- sfSapply(x = lofs, fun = function(x, wd = working.dir) {
  setwd(wd)
  load(x) # loads the saved objects into the function's environment
  return(Dataset_of_interest) # substitute the name of the object you want
}, simplify = FALSE)
sfStop()
# you could post-process the data by rbinding, cbinding, c()ing...
result <- do.call("rbind", inres)
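Since snowfall is not the only interface, here is a rough sketch of the same pattern with the base parallel package (bundled with R since 2.14), in case you prefer to avoid an extra dependency. The temp directory, the toy data frames, and the object name Dataset_of_interest are all made up for illustration; swap in your own directory and object name.

```r
library(parallel)

# fabricate two small .RData files in a scratch directory,
# purely so the example is self-contained
tmp <- file.path(tempdir(), "rdata_demo")
dir.create(tmp, showWarnings = FALSE)
for (i in 1:2) {
  Dataset_of_interest <- data.frame(file = i, value = rnorm(3))
  save(Dataset_of_interest, file = file.path(tmp, paste0("chunk", i, ".RData")))
}

# one worker per file (capped at 2 here); each loads a file and
# returns only the object of interest
lofs <- list.files(tmp, pattern = "\\.RData$", full.names = TRUE)
cl <- makeCluster(2)
inres <- parLapply(cl, lofs, function(f) {
  load(f)             # brings Dataset_of_interest into the function's environment
  Dataset_of_interest # substitute the name of your object
})
stopCluster(cl)

result <- do.call("rbind", inres)
```

Passing full paths via full.names = TRUE sidesteps the setwd()/sfExport() dance entirely, which also works with the snowfall version above.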