I'm dealing with a memory issue when searching a folder that contains millions of files. Does anyone know how to get around this? Is there a way to limit the number of files glob searches at once, so the matching can be done in chunks?
Traceback (most recent call last):
  File "./lb2_lmanager", line 533, in <module>
    main(sys.argv[1:])
  File "./lb2_lmanager", line 318, in main
    matched = match_files(policy.directory, policy.file_patterns)
  File "./lb2_lmanager", line 32, in wrapper
    res = func(*args, **kwargs)
  File "./lb2_lmanager", line 380, in match_files
    listing = glob.glob(directory)
  File "/usr/lib/python2.6/glob.py", line 16, in glob
    return list(iglob(pathname))
  File "/usr/lib/python2.6/glob.py", line 43, in iglob
    yield os.path.join(dirname, name)
  File "/usr/lib/python2.6/posixpath.py", line 70, in join
    path += '/' + b
MemoryError
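From the traceback it looks like glob.glob() just calls list(iglob(...)), and iglob itself uses os.listdir() under the hood, so the full directory listing ends up in memory either way. For reference, here is a minimal sketch of the chunked approach I have in mind, assuming a Python where os.scandir() is available (3.5+, or the scandir backport package on Python 2); scandir streams directory entries one at a time instead of building the whole list. The directory path and process() below are placeholders, not my real code.

    import fnmatch
    import os
    from itertools import islice

    def iter_matches(directory, pattern):
        # os.scandir() yields entries lazily instead of materializing
        # the complete listing the way os.listdir()/glob.glob() do.
        with os.scandir(directory) as entries:
            for entry in entries:
                if fnmatch.fnmatch(entry.name, pattern):
                    yield entry.path

    def in_chunks(iterable, size):
        # Group any iterator into lists of at most `size` items.
        it = iter(iterable)
        while True:
            chunk = list(islice(it, size))
            if not chunk:
                break
            yield chunk

    # Hypothetical usage: handle 10,000 matched paths per pass.
    for chunk in in_chunks(iter_matches('/some/huge/dir', '*.log'), 10000):
        process(chunk)  # process() is a placeholder for the real per-chunk work

With this, peak memory is bounded by the chunk size rather than the directory size. Is something along these lines the right way to go, or is there a better option?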