
I have a set of system tests which fire up some processes, create files etc., then shut them all down and delete the files.

I am encountering two intermittent errors on the cleanup:

On a log file created by one of the processes:

    os.remove(log_path)
WindowsError: [Error 32] The process cannot access the file because it is being used by another process: <path_to_file>

When trying to delete the output directory with shutil.rmtree:

File "C:\Python27\lib\shutil.py", line 254, in rmtree
    os.rmdir(path)
WindowsError: [Error 145] The directory is not empty: 'C:\\TestTarget\\xxx'

Both errors go away if I insert a 2 second delay before the tidyup, so I think the problem is with the time Windows takes to release the files. Obviously I'd like to avoid putting in delays in my tests, is there a way to wait until the filesystem has caught up?

Stefan
    You could stick a `try` block in a loop and loop until it succeeds...will that work? – Tyler MacDonell Feb 11 '14 at 09:36
  • That's a reasonable stopgap, thanks. I would like a cleaner solution though if there is one. – Stefan Feb 11 '14 at 09:45
  • I think as long as the process closes the file correctly when it is shut down, you should be able to immediately delete it. You might look for the real cause of the problem. :-) – realtime Feb 11 '14 at 12:15
  • I've experienced similar problems. I think it's either a problem with the anti-virus software or a bug in NTFS. In my experience it usually resolves very quickly, so the simplest workaround is to detect the failure, delay briefly (maybe 10ms) and try again in a loop. – Harry Johnston Feb 11 '14 at 20:55
  • Same problem here. I am deleting a directory with shutil.rmtree(), then renaming another directory to the same name using os.rename(), and getting the error "Cannot create a file when that file already exists". A little ridiculous in my opinion; I'm trying to use Python as a cross-platform shell/batch replacement, and I've definitely never seen this type of issue in a shell or batch script. – John Oct 10 '14 at 18:57
  • According to [many](https://stackoverflow.com/questions/876473/is-there-a-way-to-check-if-a-file-is-in-use) [other](https://stackoverflow.com/questions/1406808/wait-for-file-to-be-freed-by-process) [similar](https://stackoverflow.com/questions/1746781/waiting-until-a-file-is-available-for-reading-with-win32) questions there is no convenient way to detect whether a file is already in use, so you are better off with `try ... except` loops. – viilpe Nov 30 '20 at 13:32
  • Also you may try to enumerate all handles in all processes see links [here](https://stackoverflow.com/questions/183925/what-win32-api-can-be-used-to-find-the-process-that-has-a-given-file-open) but it is not so easy and I guess it will require admin rights to read all processes. – viilpe Nov 30 '20 at 13:32
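The retry-in-a-loop approach suggested in the comments can be sketched as follows. This is a minimal illustration, not an official recipe: the function name, retry count, and delay are all arbitrary choices, and `OSError` is caught rather than `WindowsError` so the sketch also runs on Python 3, where `WindowsError` is an alias of `OSError`.

```python
import os
import time


def remove_with_retry(path, retries=10, delay=0.01):
    """Retry os.remove until Windows releases the file handle.

    The retry count and delay are illustrative; tune them to taste.
    """
    for attempt in range(retries):
        try:
            os.remove(path)
            return
        except OSError:
            # On the last attempt, give up and propagate the error.
            if attempt == retries - 1:
                raise
            time.sleep(delay)
```
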

2 Answers


I had a similar problem and searched for a proper solution for months, but found none. For me the problem only occurred when running my script on Windows with Python 2.7. On Python 3 there was usually no problem, and on GNU/Linux I could use the file operations without this dirty workaround.

I ended up routing every file operation on Windows through try_fail_wait_repeat (see below); you should do something similar. You can also set the sleep to a different value.

import sys
import shutil
import time
import os

IS_WINDOWS = (sys.platform == "win32")


if IS_WINDOWS:
    maximum_number_of_tries = 40

    def move_folder(src, dst):
        return try_fail_wait_repeat(maximum_number_of_tries, shutil.move, src, dst)

    def read_file(path):
        return try_fail_wait_repeat(maximum_number_of_tries, _read_file, path)
else:
    def move_folder(src, dst):
        return shutil.move(src, dst)

    def read_file(path):
        return _read_file(path)


def _read_file(file_path):
    with open(file_path, "rb") as f_in:
        return f_in.read().decode("ISO-8859-1")


def try_fail_wait_repeat(maximum_number_of_tries, func, *args):
    """A dirty solution for a dirty bug in Windows Python 2: retry the
    operation with a short delay until it succeeds or we give up."""
    attempts = 0
    while True:
        try:
            return func(*args)
        except WindowsError:
            attempts += 1
            if attempts > maximum_number_of_tries:
                print("Too many attempts to run {}({})".format(func, args))
                raise  # re-raise the last WindowsError
            time.sleep(0.5)
Károly Szabó

The function you are using only deletes empty directories.

Try with:

import shutil
shutil.rmtree('/folder_path')

Also, try adding a sleep interval before you shut down the processes.

Rosenthal
  • I found out that rmtree calls os.rmdir(path), which still gives the same error: `clear_directory() in clear_directory shutil.rmtree(download_dir) File "C:\python27_x64\lib\shutil.py", line 247, in rmtree rmtree(fullname, ignore_errors, onerror) File "C:\python27_x64\lib\shutil.py", line 256, in rmtree onerror(os.rmdir, path, sys.exc_info()) File "C:\python27_x64\lib\shutil.py", line 254, in rmtree os.rmdir(path) WindowsError: [Error 145] The directory is not empty: 'D:\\downloads\\test'` – Gayan Pathirage Oct 22 '17 at 09:06