
I am reading 15 files located in a folder on my desktop. I can read the first few files without a problem, but a little further in I get a memory error that appears to come from Python's built-in file iterator. Note: I am using multiple lists to hold all the data because I am on 32-bit Python, and the lists fill up very quickly since the files are so large.

import glob

# path is a glob pattern for the desktop folder (defined elsewhere)
jack1, jack2, jack3, jack4, jack5 = [], [], [], [], []

x = 0  # index of the file currently being read
for file in glob.glob(path):  # iterate over every file in the folder
    with open(file) as f:
        print(f)
        for line in f:
            if "jack" in line:
                if x <= 3:  # split across lists to delay the memory error
                    jack1.append(line)
                elif 3 < x <= 6:
                    jack2.append(line)
                elif 6 < x <= 8:
                    jack3.append(line)
                elif 8 < x <= 9:  # last resort: try/except around the append
                    try:
                        jack4.append(line)
                    except MemoryError:
                        jack5.append(line)
    x += 1  # advance to the next file (was missing, so x never changed)

Traceback (most recent call last):
  File "C:\Users\erik.kniaz\workspace\arif help\jack.py", line 102, in <module>
    for line in f:
MemoryError
wallyk
tylerik

1 Answer

I am using 32bit python

There's your problem. 32-bit processes, generally speaking, are limited to 4 GB of virtual memory each, minus kernel space, which can be significant. You will need to either switch to 64-bit computing or redesign your program to consume less memory. This is normally done by pushing data back out to the filesystem instead of keeping it in memory.
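A minimal sketch of that redesign, under the same assumptions as the question (a `path` glob pattern and the substring `"jack"`; the `path` value and the `matches.txt` output name here are made up for illustration): instead of accumulating matching lines in lists, stream each one straight to an output file, so only one line is held in memory at a time no matter how large the inputs are.

```python
import glob

path = "*.txt"  # hypothetical glob pattern; substitute the folder's own

# Write matches to disk as they are found rather than keeping them in
# lists; memory use stays flat regardless of total input size.
with open("matches.txt", "w") as out:
    for filename in glob.glob(path):
        with open(filename) as f:
            for line in f:          # file iterator yields one line at a time
                if "jack" in line:
                    out.write(line)  # push the match back out to the filesystem
```

The matches can then be re-read later with the same line-by-line iteration, or split into several smaller output files if they need to be grouped as in the original code.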

Kevin