13

I have an insanely big directory, and I need to get the file list via Python.

In my code I need an iterator, not a list, so these won't work:

os.listdir
glob.glob  (uses listdir!)
os.walk

I can't find any good library for this. Help! Maybe a C++ lib?

fun_vit
  • Looks like a duplicate of *[Is there a way to efficiently yield every file in a directory containing millions of files?](http://stackoverflow.com/q/5090418/151299)*. – Oben Sonne Feb 25 '11 at 11:48
  • oh, yes. Can't find that post by search... – fun_vit Feb 25 '11 at 11:57

9 Answers

14

For Python 2.x:

import scandir
scandir.walk(path)

For Python 3.5+:

os.scandir(path)

https://www.python.org/dev/peps/pep-0471/

https://pypi.python.org/pypi/scandir
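As a minimal sketch of the idea (the `with` form of os.scandir needs Python 3.6+; the function name `iter_files` is just illustrative), you can stream file names lazily instead of building a list:

```python
import os

def iter_files(path):
    """Lazily yield names of regular files in *path*, one entry at a time."""
    with os.scandir(path) as it:  # context-manager form needs Python 3.6+
        for entry in it:
            if entry.is_file(follow_symlinks=False):
                yield entry.name
```

Because this is a generator, you can start processing the first entries before the whole directory has been read, which is exactly what listdir-based approaches cannot do.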

miguelfg
9

If you have a directory that is too big for libc's readdir() to read quickly, you probably want to look at the kernel call getdents() (http://www.kernel.org/doc/man-pages/online/pages/man2/getdents.2.html). I ran into a similar problem and wrote a long blog post about it.

http://www.olark.com/spw/2011/08/you-can-list-a-directory-with-8-million-files-but-not-with-ls/

Basically, readdir() only reads 32K of directory entries at a time, so if you have a lot of files in a directory, readdir() will take a very long time to complete.
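A rough, Linux-only sketch of this idea via ctypes (hypothetical code: it assumes x86-64, where getdents64 is syscall 217; the number differs on other architectures). It reads entries in 1 MiB batches instead of readdir()'s 32K:

```python
import ctypes
import os
import struct

SYS_getdents64 = 217   # x86-64 only; e.g. aarch64 uses a different number
BUF_SIZE = 1 << 20     # 1 MiB batches instead of readdir()'s 32K buffer

libc = ctypes.CDLL(None, use_errno=True)

def iter_dir(path):
    """Lazily yield entry names in *path* using the raw getdents64 syscall."""
    fd = os.open(path, os.O_RDONLY | os.O_DIRECTORY)
    buf = ctypes.create_string_buffer(BUF_SIZE)
    try:
        while True:
            n = libc.syscall(SYS_getdents64, fd, buf, BUF_SIZE)
            if n < 0:
                raise OSError(ctypes.get_errno(), "getdents64 failed")
            if n == 0:  # end of directory
                break
            pos = 0
            while pos < n:
                # struct linux_dirent64: u64 d_ino; s64 d_off;
                #                        u16 d_reclen; u8 d_type; char d_name[]
                ino, off, reclen, dtype = struct.unpack_from("QqHB", buf.raw, pos)
                name = buf.raw[pos + 19 : pos + reclen].split(b"\0", 1)[0]
                if name not in (b".", b".."):
                    yield name.decode()
                pos += reclen
    finally:
        os.close(fd)
```

The buffer size is the key tuning knob: one syscall per megabyte of entries instead of one per 32K is what makes listing millions of files tractable.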

Ben
1

I found this library useful: https://github.com/benhoyt/scandir.
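A small sketch of how it can be used interchangeably with the standard library (the PyPI package is the backport of what became os.scandir in Python 3.5; `iter_names` is just an illustrative name):

```python
try:
    from scandir import scandir  # PyPI backport, for Python 2
except ImportError:
    from os import scandir       # built into Python 3.5+

def iter_names(path):
    """Yield every entry name in *path* without materializing a list."""
    for entry in scandir(path):
        yield entry.name
```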

Mikhail Korobov
0

You should use a generator. This problem is discussed here: http://bugs.python.org/issue11406

socketpair
0

I think that using opendir() would work, and there is a Python package, http://pypi.python.org/pypi/opendir/0.0.1, that wraps it via Pyrex.

Dan D.
  • Sounds nice, but can't install under Windows... File "c:\python26\lib\site-packages\pyrex-0.9.9-py2.6.egg\Pyrex\Distutils\extension.py", line 69, in __init__ **kw) TypeError: unbound method __init__() must be called with Extension instance as first argument (got Extension instance instead) – fun_vit Feb 25 '11 at 11:47
0

Someone built a Python module off that article that wraps getdents(). By the way, I know this post is old, but you could use scandir (I have done that with directories of 21 million files). os.walk is way too slow; though it is also a generator, it carries too much overhead.

This module seems like an interesting alternative. I have not used it, but the author based it on the "8 million files" ls article referenced above. It also lets you tweak the read buffer without having to drop into C directly.

https://github.com/ZipFile/python-getdents

It is also available via pip and PyPI, though I recommend reading the docs first:

https://pypi.org/project/getdents/

0

I found this library really fast: https://pypi.org/project/scandir/
I used the code below from its documentation, and it worked like a charm.

import os

def subdirs(path):
    """Yield directory names not starting with '.' under given path."""
    for entry in os.scandir(path):
        if not entry.name.startswith('.') and entry.is_dir():
            yield entry.name
chanduthedev
-1

http://docs.python.org/release/2.6.5/library/os.html#os.walk

>>> import os
>>> type(os.walk('/'))
<type 'generator'>
Trey Stout
    unfortunately [os.walk uses `listdir` internally](http://hg.python.org/cpython/file/29f0836c0456/Lib/os.py#l276). – quodlibetor Apr 08 '13 at 20:44
-2

How about glob.iglob? It's the iterator version of glob.
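For example (with the caveat the question already raises: glob is built on os.listdir internally, so while iglob yields results lazily, it does not avoid reading the whole directory; `iter_matching` is just an illustrative wrapper):

```python
import glob
import os

def iter_matching(path, pattern="*"):
    """Lazily yield paths under *path* matching *pattern* via glob.iglob."""
    for match in glob.iglob(os.path.join(path, pattern)):
        yield match
```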

Dane White