55

I want to quickly find the total size of any folder using Python.

import os
from os.path import join, getsize

def GetFolderSize(path):
    TotalSize = 0
    for root, dirs, files in os.walk(path):
        for name in files:
            try:
                TotalSize += getsize(join(root, name))
            except OSError:
                print("error with file: " + join(root, name))
    return TotalSize

print(float(GetFolderSize("C:\\")) / 1024 / 1024 / 1024)

That's the simple script I wrote to get the total size of a folder; it took around 60 seconds (±5 seconds). Using multiprocessing I got it down to 23 seconds on a quad-core machine.

Using Windows Explorer it takes only ~3 seconds (right-click → Properties to see for yourself). So is there a faster way of finding the total size of a folder, something close to the speed at which Windows can do it?

Windows 7, Python 2.6. (I did search, but most of the time people used a method very similar to my own.) Thanks in advance.

AnneTheAgile
  • 9,932
  • 6
  • 52
  • 48
user202459
  • 631
  • 1
  • 6
  • 10
  • The code as presented is invalid. Could you post a complete, minimal example that you've actually run? – bignose Mar 21 '10 at 03:21
  • Sorry, only had the function before, the rest of it is edited in. – user202459 Mar 21 '10 at 03:37
  • related: [Calculating a directory size using Python?](http://stackoverflow.com/questions/1392413/calculating-a-directory-size-using-python) – jfs Nov 25 '16 at 23:22

3 Answers

83

You are at a disadvantage.

Windows Explorer almost certainly uses FindFirstFile/FindNextFile to both traverse the directory structure and collect size information (through lpFindFileData) in one pass, making what is essentially a single system call per file.

Python is unfortunately not your friend in this case. Thus,

  1. os.walk first calls os.listdir (which internally calls FindFirstFile/FindNextFile)
    • any additional system calls made from this point onward can only make you slower than Windows Explorer
  2. os.walk then calls isdir for each file returned by os.listdir (which internally calls GetFileAttributesEx -- or, prior to Win2k, a GetFileAttributes+FindFirstFile combo) to redetermine whether to recurse or not
  3. os.walk and os.listdir will perform additional memory allocation, string and array operations etc. to fill out their return value
  4. you then call getsize for each file returned by os.walk (which again calls GetFileAttributesEx)

That is 3x as many system calls per file as Windows Explorer makes, plus memory-allocation and string-manipulation overhead.

You can either use Anurag's solution, or try to call FindFirstFile/FindNextFile directly and recursively (which should be comparable to the performance of a Cygwin or other Win32 port of du -s some_directory.)
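For reference, the du invocation alluded to above looks like this (GNU du shown; -s gives a single summary line, -b reports apparent size in bytes; the /tmp path here is a throwaway example, not from the question):

```shell
# Build a tiny throwaway tree, then ask du for its total apparent size.
# du walks the tree in a single pass, much like Explorer does.
mkdir -p /tmp/size_demo/sub
head -c 100 /dev/zero > /tmp/size_demo/file.bin
head -c 50  /dev/zero > /tmp/size_demo/sub/other.bin
du -sb /tmp/size_demo
```

The reported total also includes the directory entries themselves, so it will be slightly larger than the sum of the file sizes.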

Refer to os.py for the implementation of os.walk, posixmodule.c for the implementation of listdir and win32_stat (invoked by both isdir and getsize.)

Note that Python's os.walk is suboptimal on all platforms (Windows and *nices), up to and including Python 3.1. On both Windows and *nices os.walk could achieve traversal in a single pass without calling isdir, since both FindFirstFile/FindNextFile (Windows) and opendir/readdir (*nix) already return the file type via lpFindFileData->dwFileAttributes (Windows) and dirent::d_type (*nix).

Perhaps counterintuitively, on most modern configurations (e.g. Win7 and NTFS, and even some SMB implementations) GetFileAttributesEx is twice as slow as FindFirstFile of a single file (possibly even slower than iterating over a directory with FindNextFile.)

Update: Python 3.5 includes the new PEP 471 os.scandir() function that solves this problem by returning file attributes along with the filename. This new function is used to speed up the built-in os.walk() (on both Windows and Linux). You can use the scandir module on PyPI to get this behavior for older Python versions, including 2.x.
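A minimal sketch of the scandir approach, closely following the get_tree_size example in PEP 471 itself (Python 3.5+; the function name is just illustrative). Each DirEntry carries the type information from the directory listing, so on Windows no extra per-file stat call is needed:

```python
import os

def get_tree_size(path):
    """Return the total size in bytes of all regular files under path.

    Uses os.scandir, so entry type (and, on Windows, size) comes from
    the directory listing itself rather than a separate stat call per
    file -- the same single-pass pattern Explorer uses.
    """
    total = 0
    for entry in os.scandir(path):
        if entry.is_dir(follow_symlinks=False):
            # Recurse into subdirectories without following symlinks.
            total += get_tree_size(entry.path)
        elif entry.is_file(follow_symlinks=False):
            total += entry.stat(follow_symlinks=False).st_size
    return total
```

On POSIX systems entry.stat() still issues one stat per file (cached on the entry), but the is_dir check is free, which is where most of os.walk's overhead went.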

Ben Hoyt
  • 10,694
  • 5
  • 60
  • 84
vladr
  • 65,483
  • 18
  • 129
  • 130
  • 1
    The mentioned PEP page contains an example for exactly this purpose: https://www.python.org/dev/peps/pep-0471/#examples – Hossein Apr 04 '16 at 11:12
22

If you want the same speed as Explorer, why not use Windows Scripting to access the same functionality through win32com, e.g.:

import win32com.client as com

folderPath = r"D:\Software\Downloads"
fso = com.Dispatch("Scripting.FileSystemObject")
folder = fso.GetFolder(folderPath)
MB = 1024 * 1024.0
print("%.2f MB" % (folder.Size / MB))

It will work the same as Explorer; you can read more about the Scripting runtime at http://msdn.microsoft.com/en-us/library/bstcxhf7(VS.85).aspx.

Augustin
  • 2,444
  • 23
  • 24
Anurag Uniyal
  • 85,954
  • 40
  • 175
  • 219
  • 2
    That works great, amazing actually. But only most of the time. In a directory ('C:\Downloads') with a size of 37GB and 7 000 files your method gets the result almost instantaneously. The os.walk() way gets the result back in a couple of seconds (3 seconds). But I have some problems on other directories such as C:\Windows, C:\users etc. where it says an exception occurred. – user202459 Mar 21 '10 at 03:56
  • 1
    @freakazo, C:\Windows worked on my machine, what error do you get? – Anurag Uniyal Mar 21 '10 at 04:37
  • 1
    Traceback (most recent call last): File "Test.py", line 7, in print "%.2f MB"%(folder.Size/MB) File "C:\python26_32\lib\site-packages\win32com\client\dynamic.py", line 501, in __getattr__ ret = self._oleobj_.Invoke(retEntry.dispid,0,invoke_type,1) pywintypes.com_error: (-2147352567, 'Exception occurred.', (0, None, None, None, 0, -2146828218), None) Press any key to continue . . . ### A couple more tests showed that it is folder.size that's giving the problem. folder.name for example works on the C:\Windows directory – user202459 Mar 21 '10 at 05:45
  • This is long dead of course but I was getting the same thing on c:\users\myname and looking it up, it was permission denied. It works on anything you've created yourself but anything system-y appears a no go, even running the script as Administrator. – jambox Jul 27 '12 at 09:15
5

I ran the Python code against a 15k-directory tree containing 190k files and compared it with the du(1) command, which presumably goes about as fast as the OS allows. The Python code took 3.3 seconds versus 0.8 seconds for du. This was on Linux.

I'm not sure there is much to squeeze out of the Python code. Note too that the first run of du took 45 seconds, which was obviously before the relevant i-nodes were in the block cache; this performance is therefore heavily dependent on how well the system is managing its store. It wouldn't surprise me if either or both of the following were true:

  1. os.path.getsize is sub-optimal on Windows
  2. Windows caches directory contents size once calculated
msw
  • 42,753
  • 9
  • 87
  • 112
  • 2
    It looks like it is indeed slower on Windows; on Windows with a 23K directory tree and 175K files it took around 60 seconds. Using the du Windows equivalent it took 6 seconds to complete. So it looks like Python is 10x slower than du on Windows and 4x slower on Linux. So yes, it seems that 1. os.path.getsize/os.walk is indeed sub-optimal on Windows 2. Windows does seem to cache directory contents size 3. Windows still is just slower than Linux – user202459 Mar 21 '10 at 04:14