I'm trying to load ~2GB of text files (approximately 35K files) in my Python script. I'm getting a MemoryError about a third of the way through, on page.read(). My code looks like this:
cFile_list = []
for f in files:
    with open(f) as page:  # "with" closes the file even if read() raises
        cFile_list.append(page.read().replace('\n', ''))
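
In case it helps, I'm wondering whether a generator would sidestep the problem, assuming I don't strictly need every page in memory at the same time (read_pages is just a name I made up for this sketch):

def read_pages(paths):
    # Yield one cleaned page at a time instead of accumulating all of them.
    for path in paths:
        with open(path) as page:
            yield page.read().replace('\n', '')

# e.g. count total characters without ever holding more than one page:
total_chars = sum(len(p) for p in read_pages(files))
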
I've never dealt with objects or processes of this size in Python before. I checked some of the other Python MemoryError threads, but nothing there fixed my scenario. Hopefully someone can point me in the right direction.