419

I'm using .NET 3.5, trying to recursively delete a directory using:

Directory.Delete(myPath, true);

My understanding is that this should throw if files are in use or there is a permissions problem, but otherwise it should delete the directory and all of its contents.

However, I occasionally get this:

System.IO.IOException: The directory is not empty.
    at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
    at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive)
    at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive)
    ...

I'm not surprised that the method sometimes throws, but I'm surprised to get this particular message when recursive is true. (I know the directory is not empty.)

Is there a reason I'd see this instead of AccessViolationException?

apxcode
Jason Anderson
  • 15
    You wouldn't see AccessViolationException -- that's for invalid pointer operations, not for disk access. – Joe White May 28 '09 at 20:31
  • 1
    This does seem to be some sort of IO issue other than just the directory not being empty, like open file handles or something. I'd try using the recursive delete option, then in a catch for IOException, search for and close any open file handles, then retry. There's a discussion about that over here: http://stackoverflow.com/questions/177146/how-do-i-get-the-list-of-open-file-handles-by-process-in-c – Dan Csharpster Feb 05 '15 at 19:06

29 Answers

246

Editor's note: Although this answer contains some useful information, it is factually incorrect about the workings of Directory.Delete. Please read the comments for this answer, and other answers to this question.


I ran into this problem before.

The root of the problem is that this function does not delete files that are within the directory structure. So what you'll need to do is create a function that deletes all the files within the directory structure, and then all the directories, before removing the directory itself. I know this goes against the second parameter, but it's a much safer approach. In addition, you will probably want to remove READ-ONLY access attributes from the files right before you delete them; otherwise that will raise an exception.

Just slap this code into your project.

public static void DeleteDirectory(string target_dir)
{
    string[] files = Directory.GetFiles(target_dir);
    string[] dirs = Directory.GetDirectories(target_dir);

    foreach (string file in files)
    {
        File.SetAttributes(file, FileAttributes.Normal);
        File.Delete(file);
    }

    foreach (string dir in dirs)
    {
        DeleteDirectory(dir);
    }

    Directory.Delete(target_dir, false);
}

Also, I personally add a restriction on the areas of the machine that are allowed to be deleted, because you don't want someone calling this function on C:\WINDOWS (%WinDir%) or C:\.
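
For example, a guard along these lines (just a sketch; the protected list and method name are illustrative assumptions) can refuse obviously dangerous targets before delegating to the DeleteDirectory method above:

private static readonly string[] ProtectedRoots =
{
    Environment.GetEnvironmentVariable("windir"),   // e.g. C:\WINDOWS
    Path.GetPathRoot(Environment.SystemDirectory)   // e.g. C:\
};

public static void SafeDeleteDirectory(string target_dir)
{
    string full = Path.GetFullPath(target_dir).TrimEnd(Path.DirectorySeparatorChar);
    foreach (string root in ProtectedRoots)
    {
        if (string.Equals(full, root.TrimEnd(Path.DirectorySeparatorChar),
                          StringComparison.OrdinalIgnoreCase))
        {
            throw new InvalidOperationException("Refusing to delete protected path: " + full);
        }
    }
    DeleteDirectory(target_dir);
}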

user247702
Jeremy Edwards
  • 4
    Seriously? Seems like a no-brainer .NET extension. Why would the framework designers choose to omit this? – Mike Caron Dec 28 '09 at 21:03
  • 1
    Thanks by the way, this was a helpful copy/paste. – Mike Caron Dec 28 '09 at 21:06
  • 132
    This is nonsense. Directory.Delete(myPath, true) is an overload that deletes all files that are within the directory structure. If you're going to get it wrong, get it wrong with Ryan S's answer. – Sig. Tolleranza Feb 10 '10 at 09:00
  • I had the same exception, and it does seem like Ryan S.'s answer is more likely to be related to the problem. – pauloya Mar 04 '10 at 15:19
  • 38
    +1 because although Directory.Delete() does delete files inside its subdirectories (with recursive = true), it throws an "IOException : Directory is not empty" if one of the sub-directories or files is read-only. So this solution works better than Directory.Delete() – Anthony Brien May 02 '10 at 05:19
  • 1
    We have a product deployed on many different environments. Recently I had another one of these exceptions on a different module. I can see that there are no read-only files on the directory. Again I point to Ryan S. solution. – pauloya Oct 19 '10 at 07:35
  • 18
    Your statement that `Directory.Delete(path, true)` does not delete files is wrong. See MSDN http://msdn.microsoft.com/en-us/library/fxeahc5f.aspx – Konstantin Spirin Apr 13 '11 at 08:04
  • 21
    -1 Can someone please put a clear marker that the validity of this approach is very much in doubt. If `Directory.Delete(string,bool)` fails, something is locked or mis-permissioned and there is no one-size-fits-all solution to such a problem. People need to address that issue in their context, and we shouldn't be growing a big hairy throw-every-idea-at-the-problem (with retries and exception swallowing) and hoping for a good outcome. – Ruben Bartelink Feb 08 '12 at 08:48
  • 4
    won't work for me. still get "The directory is not empty" exception if some of the sub-folders open in explorer. – mt_serg Apr 09 '12 at 12:53
  • 1
    @mt_serg You'll get that result if you try doing the delete in Explorer itself in that case. AFAIK you need to use some sort of kernel level kludge to forcibly close/break another applications file handle in order to delete the file. – Dan Is Fiddling By Firelight Apr 13 '12 at 13:13
  • 37
    Beware of this approach if the directory you're deleting has shortcuts/symbolic links to other folders - you may end up deleting more than you expected – Chanakya May 30 '12 at 14:34
  • 2
    Yes, @Chanakya is correct. `Directory.Delete(path, true)` does not delete files/folders pointed to by symbolic links. His suggested method does! To avoid this call `File.GetAttributes` and check for `FileAttributes.ReparsePoint` bit. If it's set, make sure to remove it from file/folder attributes before deleting it. Also in case of a folder, do not call own function on a symbolic link recursively. Instead simply remove it with `Directory.Delete(dir, false)` and it should work. – ahmd0 Aug 10 '13 at 22:28
  • 4
    This answer helped me get around the read-only exceptions. Even though it's lame that Directory.Delete doesn't have an overload that does it, at least this code snippet does! Thanks, Jeremy Edwards! – kayleeFrye_onDeck Jan 15 '15 at 20:04
  • This solution sometimes doesn't work. After using this method, some directory sometimes still exists. – Rayet Jan 21 '15 at 09:12
  • Worked just fine for me, except there are limitations. Fails when: 1) a file from any directory/sub-directory is open or locked, 2) the folder or any sub-folder is open. You have to account for these factors in any rendition of this code you use. #1: If the file is an app (.exe), or runs an application (i.e. is a Text file, so runs Notepad), you can run WMI code to end that process on that target PC, or do `Process.Kill()` if on the local PC, & add a delay/retry for locks by AV, Backup software, etc. #2: You can terminate `explorer.exe` through WMI code remotely or `Process.Kill()` locally. – vapcguy Jan 24 '17 at 19:38
  • 5
    Four of the above comments, all from 2010, refer to an answer by "Ryan S.". Some time during the last 8 years Ryan S. has apparently morphed into @ryascl, aka Ryan Smith. – RenniePet Apr 24 '18 at 05:16
  • I was getting the same exception and I changed the last line from Directory.Delete(target_dir, false); to Directory.Delete(target_dir, true); and it works fine. – DGaleano May 07 '19 at 16:34
  • 2
    His current username is @rpisryan. The answer: https://stackoverflow.com/a/1703799/2724588 – Gerco Brandwijk Feb 04 '21 at 10:57
197

If you are trying to recursively delete directory a and directory a\b is open in Explorer, b will be deleted but you will get the error 'directory is not empty' for a even though it is empty when you go and look. The current directory of any application (including Explorer) retains a handle to the directory. When you call Directory.Delete(path, true), it deletes from the bottom up: b, then a. If b is open in Explorer, Explorer will detect the deletion of b, change directory upwards (cd ..) and clean up open handles. Since the file system operates asynchronously, the Directory.Delete operation fails due to conflicts with Explorer.

Incomplete solution

I originally posted the following solution, with the idea of yielding the current thread to allow Explorer time to release the directory handle.

// incomplete!
try
{
    Directory.Delete(path, true);
}
catch (IOException)
{
    Thread.Sleep(0);
    Directory.Delete(path, true);
}

But this only works if the open directory is the immediate child of the directory you are deleting. If a\b\c\d is open in Explorer and you use this on a, this technique will fail after deleting d and c.

A somewhat better solution

This method will handle deletion of a deep directory structure even if one of the lower-level directories is open in Explorer.

/// <summary>
/// Depth-first recursive delete, with handling for descendant 
/// directories open in Windows Explorer.
/// </summary>
public static void DeleteDirectory(string path)
{
    foreach (string directory in Directory.GetDirectories(path))
    {
        DeleteDirectory(directory);
    }

    try
    {
        Directory.Delete(path, true);
    }
    catch (IOException) 
    {
        Directory.Delete(path, true);
    }
    catch (UnauthorizedAccessException)
    {
        Directory.Delete(path, true);
    }
}

Despite the extra work of recursing on our own, we still have to handle the UnauthorizedAccessException that can occur along the way. It's not clear whether the first deletion attempt is paving the way for the second, successful one, or if it's merely the timing delay introduced by throwing and catching an exception that allows the file system to catch up.

You might be able to reduce the number of exceptions thrown and caught under typical conditions by adding a Thread.Sleep(0) at the beginning of the try block. Additionally, there is a risk that under heavy system load, you could fly through both of the Directory.Delete attempts and fail. Consider this solution a starting point for more robust recursive deletion.

General answer

This solution only addresses the peculiarities of interacting with Windows Explorer. If you want a rock-solid delete operation, one thing to keep in mind is that anything (virus scanner, whatever) could have an open handle to what you are trying to delete, at any time. So you have to try again later. How much later, and how many times you try, depends on how important it is that the object be deleted. As MSDN indicates,

Robust file iteration code must take into account many complexities of the file system.

This innocent statement, supplied with only a link to the NTFS reference documentation, ought to make your hair stand up.
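
To make "try again later" concrete, here is a minimal retry sketch driven by a caller-supplied timeout, with an exponentially growing delay; the initial 50 ms and the doubling factor are arbitrary assumptions, not recommendations:

public static void DeleteDirectoryWithRetries(string path, TimeSpan timeout)
{
    var stopwatch = System.Diagnostics.Stopwatch.StartNew();
    var delay = TimeSpan.FromMilliseconds(50); // arbitrary starting point
    while (true)
    {
        try
        {
            Directory.Delete(path, true);
            return;
        }
        catch (DirectoryNotFoundException)
        {
            return; // already gone; treat as success
        }
        catch (IOException)
        {
            if (stopwatch.Elapsed >= timeout)
                throw; // give up; let the caller decide what failure means
            Thread.Sleep(delay);
            delay += delay; // exponential backoff
        }
    }
}

Forcing the caller to pass a TimeSpan keeps the "how much later, and how many times" decision at the call site, where the importance of the deletion is actually known.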

(Edit: A lot. This answer originally had only the first, incomplete solution.)

Glorfindel
rpggio
  • 11
    It does appear calling Directory.Delete(path, true) while path or one of the folders/files under path is open or selected in Windows Explorer will throw an IOException. Closing Windows Explorer and rerunning my existing code w/o the try/catch suggested above worked fine. – David Alpert Feb 24 '10 at 19:29
  • I think this may happen even if explorer is not open on the directory. I think it can be related with anti-virus accessing it. In my case the machine had Norton. – pauloya Oct 19 '10 at 07:36
  • 1
    I cannot fathom how and why it works but it worked for me while setting file attributes and writing my own recursive function didn't. – Stilgar Dec 16 '10 at 10:27
  • 1
    @CarlosLiu Because it is giving "Explorer a chance to release the directory handle" – Dmitry Gonchar May 08 '13 at 14:04
  • 4
    What is happening is that the system asks Explorer to "release the directory handle", then attempts to delete the directory. If the directory handle was not deleted in time, an exception is raised and the `catch` block is executed (meanwhile, Explorer is still releasing the directory, as no command has been sent to tell it not to do so). The call to `Thread.Sleep(0)` may or may not be necessary, as the `catch` block has already given the system a bit more time, but it does provide a little extra safety for a low cost. After that, the `Delete` is called, with the directory already released. – Zachary Kniebel Jun 21 '13 at 13:26
  • Functions related to file existence always rely on non-standard methods; there is no way otherwise. – prabhakaran Nov 05 '13 at 08:07
  • Would it be worth adding a slightly higher sleep time (e.g. Thread.Sleep(100)) - just for laughs - and maybe for faster computers? I can't reproduce the issue to confirm this idea would be of any benefit, but can anybody confirm (or guess) if this is a reasonable bet? – PandaWood Nov 13 '13 at 02:45
  • 1
    @PandaWood actually only this Sleep(100) worked for me. Sleep(0) didn't work. I have no idea what is going on and how to solve this properly. I mean, what if it depends on server load and in future there should be 300 or 400? How to know that. Must be another proper way... – Roman Nov 21 '13 at 11:11
  • 1
    @Roman thanks, interesting. I used 100 too. Something that's locked must be in the process of unlocking, but hasn't finished, and that's what we're waiting for. I guess this is like a failing network connection, you can only try for so long, or so many times, at some point your code may have to give up. – PandaWood Nov 21 '13 at 23:05
  • @Roman - Idea: Go all the way through the recursion. Return a boolean False if there was a problem at any point. Pass that False all the way to the top level method. At this point, we will have deleted everything that can be easily deleted, and we will have told system to ATTEMPT to delete everything, so Explorer is hopefully starting to release its handles. Now Sleep(500) to give abundant time for any release to occur. Then repeat the entire recursion. This way, we do one long Sleep, rather than repeated Sleeps throughout the levels of recursion. If still False, could sleep again even longer. – ToolmakerSteve Apr 30 '18 at 12:02
  • Note that you might also see this problem if you forget to dispose a file-watcher, watching content in one of the folders/files you are trying to delete - which is what I forgot to do... – Spiralis Jun 24 '18 at 22:28
48

Before going further, check for the following reasons that are under your control:

  • Is the folder set as the current directory of your process? If yes, change it to something else first (see the sketch after this list).
  • Have you opened a file (or loaded a DLL) from that folder? (and forgot to close/unload it)
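
A minimal sketch of that first check (the path is an illustrative assumption):

string myPath = @"C:\Temp\WorkArea"; // illustrative
if (Directory.GetCurrentDirectory().StartsWith(myPath, StringComparison.OrdinalIgnoreCase))
{
    // Move this process somewhere safe so it no longer holds the target directory.
    Directory.SetCurrentDirectory(Path.GetTempPath());
}
Directory.Delete(myPath, true);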

Otherwise, check for the following legitimate reasons outside of your control:

  • There are files marked as read-only in that folder.
  • You don't have a deletion permission to some of those files.
  • The file or subfolder is open in Explorer or another app.

If any of the above is the problem, you should understand why it happens before trying to improve your deletion code. Should your app be deleting read-only or inaccessible files? Who marked them that way, and why?

Once you have ruled out the above reasons, there's still a possibility of spurious failures. The deletion will fail if anyone holds a handle to any of the files or folders being deleted, and there are many reasons why someone may be enumerating the folder or reading its files:

  • search indexers
  • anti-viruses
  • backup software

The general approach to deal with spurious failures is to try multiple times, pausing between the attempts. You obviously don't want to keep trying forever, so you should give up after a certain number of attempts and either throw an exception or ignore the error. Like this:

private static void DeleteRecursivelyWithMagicDust(string destinationDir) {
    const int magicDust = 10;
    for (var gnomes = 1; gnomes <= magicDust; gnomes++) {
        try {
            Directory.Delete(destinationDir, true);
        } catch (DirectoryNotFoundException) {
            return;  // good!
        } catch (IOException) { // System.IO.IOException: The directory is not empty
            System.Diagnostics.Debug.WriteLine("Gnomes prevent deletion of {0}! Applying magic dust, attempt #{1}.", destinationDir, gnomes);

            // see http://stackoverflow.com/questions/329355/cannot-delete-directory-with-directory-deletepath-true for more magic
            Thread.Sleep(50);
            continue;
        }
        return;
    }
    // depending on your use case, consider throwing an exception here
}

In my opinion, a helper like that should be used for all deletions because spurious failures are always possible. However, YOU SHOULD ADAPT THIS CODE TO YOUR USE CASE, not just blindly copy it.

I had spurious failures for an internal data folder generated by my app, located under %LocalAppData%, so my analysis goes like this:

  1. The folder is controlled solely by my application, and the user has no valid reason to go and mark things as read-only or inaccessible inside that folder, so I don't try to handle that case.

  2. There's no valuable user-created stuff in there, so there's no risk of forcefully deleting something by mistake.

  3. Being an internal data folder, I don't expect it to be open in explorer, at least I don't feel the need to specifically handle the case (i.e. I'm fine handling that case via support).

  4. If all attempts fail, I choose to ignore the error. Worst case, the app fails to unpack some newer resources, crashes and prompts the user to contact support, which is acceptable to me as long as it does not happen often. Or, if the app does not crash, it will leave some old data behind, which again is acceptable to me.

  5. I choose to limit retries to 500ms (50 * 10). This is an arbitrary threshold which works in practice; I wanted the threshold to be short enough so that users wouldn't kill the app, thinking that it has stopped responding. On the other hand, half a second is plenty of time for the offender to finish processing my folder. Judging from other SO answers which sometimes find even Sleep(0) to be acceptable, very few users will ever experience more than a single retry.

  6. I retry every 50ms, which is another arbitrary number. I feel that if a file is being processed (indexed, checked) when I try to delete it, 50ms is about the right time to expect the processing to be completed in my case. Also, 50ms is small enough to not result in a noticeable slowdown; again, Sleep(0) seems to be enough in many cases, so we don't want to delay too much.

  7. The code retries on any IO exceptions. I don't normally expect any exceptions accessing %LocalAppData%, so I chose simplicity and accepted the risk of a 500ms delay in case a legitimate exception happens. I also didn't want to figure out a way to detect the exact exception that I want to retry on.

Andrey Tarantsov
  • 8
    P.P.S. A few months later, I'm happy to report that this (somewhat insane) piece of code has completely solved the issue. Support requests about this problem are down to zero (from about 1-2 per week). – Andrey Tarantsov May 20 '13 at 09:58
  • 1
    +0 While this is a more robust and less 'here it is; the perfect solution for you' than http://stackoverflow.com/a/7518831/11635, for me the same applies - programming by coincidence - handle with care. One useful point embodied in your code is that if you are going to do a retry, you do need to consider that you are in a race with the ambiguity of whether the Directory has 'Gone' since the last attempt [and a naive `Directory.Exists` guard would not resolve that.] – Ruben Bartelink May 20 '13 at 12:26
  • 1
    love it ... don't know what I'm doing that this is always a pain point for me ... but it is not because I have the directory open in explorer ... not much uproar on the internet about this more-or-less bug ... at least me and Andrey have a way to deal with it :) – TCC Sep 25 '13 at 22:06
  • 1
    This is probably the most concise yet complete answer still dealing with the odd cases of directories randomly refusing to get deleted. Also, I've witnessed the issue of stubborn directory on various occasions on different machines. I have found nowhere any decent documentation on why exactly this happens, but I guess it's always related to file hooks on OS level like anti-virus or Windows/Google (rip) desktop search. – Grimace of Despair Jan 14 '14 at 04:56
  • 2
    @RubenBartelink While this code is pretty random (and 500ms delay may well not be enough), I really don't see any sane way to deal with a file system that does not allow to delete open items. It's not like Windows has an API to obtain exclusive access to a folder. Any solution that works in practice, does not generate support issues and has reasonable worst-case behavior is quite an acceptable one in my book. – Andrey Tarantsov Jan 20 '14 at 08:02
  • 1
    @RubenBartelink But after thinking about it further, I've expanded the answer to include the full reasoning that goes into a helper like that. Hope you like it better now. :-) – Andrey Tarantsov Jan 20 '14 at 08:56
  • @AndreyTarantsov +1 Yes, much better now even if the 10x and the 50ms are very debatable - lets hope the votes go the right way because some of the others are just shockingly bad – Ruben Bartelink Jan 20 '14 at 11:04
  • @RubenBartelink Just out of curiosity, what would you use instead of 10x and 50ms? I understand it won't work for every case, and perhaps I should have posted a more generally applicable, universal version. I just expect the developers to apply their brains when reusing the code they find.. – Andrey Tarantsov Jan 20 '14 at 13:46
  • 1
    @AndreyTarantsov Apply brains on what basis? How did your brain come up with 10 and 50? For me, the inputs are: 1) `Sleep(0)` yields - `Sleep(1)` or `Sleep(50)` doesn't sleep any better (but if its released after 1 ms you've wasted 49 2) if its not done in 50ms why will it be done in 500ish? 3) is it important to delete it? I would drive the loop by total elapsed time (into the seconds) with short sleeps (prob exponential). And I'd wrap it up in a method that takes a TimeSpan and let the caller be *forced* to think. – Ruben Bartelink Jan 20 '14 at 14:08
  • Because people reading this Q are looking for their magic and *will* paste your code _and_ they will not edit it. The fact that you put 10 and 50 in there will make many people think (yes, for invalid reasons) they are important numbers with some relevance either a) necessary or b) sufficient to make it work. They are neither and hence should not be there. Now, don't make me answer this goddam question! – Ruben Bartelink Jan 20 '14 at 14:13
  • 2
    @RubenBartelink OK, so I think we can agree on this: posting a piece of code that works for one specific app (and was never meant to be suitable for every case) as an SO answer is going to be a disservice to many novice and/or ignorant developers. I gave it as a starting point for customization, but yeah, some people are going to use it as is, and that's a bad thing. – Andrey Tarantsov Jan 20 '14 at 14:47
  • I like the idea. It solved my problem and is now part of a helper library I regularily use. I made the wait time and repeat count parameters with reasonable defaults (20 repeats, 50ms) based on experiments: having an explorer open 6 levels deep in the directory to delete would need 8 attempts to delete the directory on Win7/quadcore. I did not like the enumeration approach from @ryascl because it (probably) introduces overhead when a simple delete would work, usually the majority of cases. – Peter - Reinstate Monica Oct 16 '14 at 09:31
  • I needed to delete the iis folders... its work for me. – Arthur Menezes May 27 '15 at 12:24
  • Beware, it will fail with a silent exception. Use `if (gnomes == magicDust) throw;`. Btw I find this code the best, since it is the most gentle solution. It is only more patient than the default recursive directory delete, but not forcing anything like the other solutions. – nopara73 Jan 09 '17 at 10:33
  • 2
    @nopara You don't need the comparison; if we're out of the loop, we've failed. And yes, in many cases you will want to throw exception, then add appropriate error handling code up the stack, likely with a user-visible message. – Andrey Tarantsov Jan 09 '17 at 16:27
  • Why a recursive directory delete cares whether a file or subdirectory is read-only or not seems completely arbitrary and stupid; now, if the _parent_ directory were read-only, that would make sense. This answer works for me regarding Delete's other problems. – Suncat2000 May 04 '21 at 15:16
23

Modern Async Answer

The accepted answer is just plain wrong; it might work for some people because the time taken to get files from disk frees up whatever was locking the files. The fact is, this happens because files get locked by some other process/stream/action. The other answers use Thread.Sleep (yuck) to retry deleting the directory after some time. This question needs revisiting with a more modern answer.

public static async Task<bool> TryDeleteDirectory(
   string directoryPath,
   int maxRetries = 10,
   int millisecondsDelay = 30)
{
    if (directoryPath == null)
        throw new ArgumentNullException(nameof(directoryPath));
    if (maxRetries < 1)
        throw new ArgumentOutOfRangeException(nameof(maxRetries));
    if (millisecondsDelay < 1)
        throw new ArgumentOutOfRangeException(nameof(millisecondsDelay));

    for (int i = 0; i < maxRetries; ++i)
    {
        try
        {
            if (Directory.Exists(directoryPath))
            {
                Directory.Delete(directoryPath, true);
            }

            return true;
        }
        catch (IOException)
        {
            await Task.Delay(millisecondsDelay);
        }
        catch (UnauthorizedAccessException)
        {
            await Task.Delay(millisecondsDelay);
        }
    }

    return false;
}
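
Call-site usage might look like this (a sketch; the path and the failure handling are assumptions):

public static async Task CleanupAsync()
{
    // TryDeleteDirectory is the helper defined above.
    bool deleted = await TryDeleteDirectory(@"C:\Temp\Work", maxRetries: 10, millisecondsDelay: 30);
    if (!deleted)
    {
        // Log it, surface it to the user, or schedule a later cleanup.
    }
}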

Unit Tests

These tests show an example of how a locked file can cause the Directory.Delete to fail and how the TryDeleteDirectory method above fixes the problem.

[Fact]
public async Task TryDeleteDirectory_FileLocked_DirectoryNotDeletedReturnsFalse()
{
    var directoryPath = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());
    var subDirectoryPath = Path.Combine(directoryPath, "SubDirectory");
    var filePath = Path.Combine(directoryPath, "File.txt");

    try
    {
        Directory.CreateDirectory(directoryPath);
        Directory.CreateDirectory(subDirectoryPath);

        using (var fileStream = new FileStream(filePath, FileMode.Create, FileAccess.Write, FileShare.Write))
        {
            var result = await TryDeleteDirectory(directoryPath, 3, 30);
            Assert.False(result);
            Assert.True(Directory.Exists(directoryPath));
        }
    }
    finally
    {
        if (Directory.Exists(directoryPath))
        {
            Directory.Delete(directoryPath, true);
        }
    }
}

[Fact]
public async Task TryDeleteDirectory_FileLockedThenReleased_DirectoryDeletedReturnsTrue()
{
    var directoryPath = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());
    var subDirectoryPath = Path.Combine(directoryPath, "SubDirectory");
    var filePath = Path.Combine(directoryPath, "File.txt");

    try
    {
        Directory.CreateDirectory(directoryPath);
        Directory.CreateDirectory(subDirectoryPath);

        Task<bool> task;
        using (var fileStream = new FileStream(filePath, FileMode.Create, FileAccess.Write, FileShare.Write))
        {
            task = TryDeleteDirectory(directoryPath, 3, 30);
            await Task.Delay(30);
            Assert.True(Directory.Exists(directoryPath));
        }

        var result = await task;
        Assert.True(result);
        Assert.False(Directory.Exists(directoryPath));
    }
    finally
    {
        if (Directory.Exists(directoryPath))
        {
            Directory.Delete(directoryPath, true);
        }
    }
}
Muhammad Rehan Saeed
  • Can you expand on what you mean by "modern"? What are the benefits of your approach? Why are the others, in your opinion wrong? – TinyRacoon Oct 25 '19 at 09:18
  • 3
    Others are not wrong. They just use older APIs like `Thread.Sleep`, which you should avoid today; use `async`/`await` with `Task.Delay` instead. That's understandable; this is a very old question. – Muhammad Rehan Saeed Oct 25 '19 at 14:08
  • This approach won't work in VB.Net (at least not with a very literal line-for-line conversion) due to `BC36943 'Await' cannot be used inside a 'Catch' statement, a 'Finally' statement, or a 'SyncLock' statement.` – amonroejj Nov 04 '19 at 20:15
  • @amonroejj You must be using an older version. That was fixed. – Muhammad Rehan Saeed Nov 05 '19 at 08:58
  • 1
    Little improvement instead of return true `if (!Directory.Exists(directoryPath)) { return true; } await Task.Delay(millisecondsDelay);` to wait until the directory is really gone – fuchs777 May 27 '20 at 13:58
  • 1
    if (directoryPath == null) throw new ArgumentNullException(nameof(directoryPath)); << added nameof if (maxRetries < 1) ... – Morten Strand Apr 18 '23 at 11:40
  • If going async, might I suggest also using the DeleteAsync method, and also taking a CancellationToken as a parameter to the function (and using it), so a caller can terminate the operation if necessary? – dodexahedron Jul 20 '23 at 02:46
18

One important thing which should be mentioned (I'd have added it as a comment, but I'm not allowed to) is that the overload's behavior changed from .NET 3.5 to .NET 4.0.

Directory.Delete(myPath, true);

Starting with .NET 4.0 it deletes files in the folder itself, but NOT in 3.5. This can be seen in the MSDN documentation as well.

.NET 4.0

Deletes the specified directory and, if indicated, any subdirectories and files in the directory.

.NET 3.5

Deletes an empty directory and, if indicated, any subdirectories and files in the directory.

jettatore
  • 3
    I think it's only a documentation change... if it deletes only an "empty directory", what would it mean to also delete files in the directory with the 2nd parameter? If it's empty, there are no files... – Stefano Feb 16 '17 at 11:02
  • I'm afraid you're assuming wrong. I've posted this after testing the code with both framework versions. Deleting a non-empty folder in 3.5 will throw an exception. – jettatore Feb 22 '17 at 11:14
15

I had the very same problem under Delphi. And the end result was that my own application was locking the directory I wanted to delete. Somehow the directory got locked when I was writing to it (some temporary files).

The fix was simple: I changed the current directory to its parent before deleting it.
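
A literal C# translation of that workaround might look like this (a sketch; the path is illustrative, and note that Directory.GetParent returns null for a root directory):

string myPath = @"C:\Temp\Work";
// Step out of the directory before deleting it, so our own process
// no longer holds it as the current directory.
Directory.SetCurrentDirectory(Directory.GetParent(myPath).FullName);
Directory.Delete(myPath, true);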

Drejc
13

You can reproduce the error by running:

Directory.CreateDirectory(@"C:\Temp\a\b\c\");
Process.Start(@"C:\Temp\a\b\c\");
Thread.Sleep(1000);
Directory.Delete(@"C:\Temp\a\b\c");
Directory.Delete(@"C:\Temp\a\b");
Directory.Delete(@"C:\Temp\a");

When trying to delete directory 'b', it throws the IOException "The directory is not empty". That's stupid since we just deleted the directory 'c'.

From my understanding, the explanation is that directory 'c' is stamped as deleted, but the deletion is not yet committed by the system. The system has replied that the job is done while, in fact, it is still processing. The system probably waits for the last handle to the directory (for example, the file explorer focused on the parent directory) to be closed before committing the delete.

If you look at the source code of the Delete function (http://referencesource.microsoft.com/#mscorlib/system/io/directory.cs) you will see it uses the native Win32Native.RemoveDirectory function. This do-not-wait behavior is noted here:

The RemoveDirectory function marks a directory for deletion on close. Therefore, the directory is not removed until the last handle to the directory is closed.

(http://msdn.microsoft.com/en-us/library/windows/desktop/aa365488(v=vs.85).aspx)

Sleep and retry is the solution. Cf. ryascl's solution.

Olivier de Rivoyre
10

I'm surprised that no one thought of this simple non-recursive method, which can delete directories containing read-only files without needing to change the read-only attribute of each of them.

Process.Start("cmd.exe", "/c " + @"rmdir /s/q C:\Test\TestDirectoryContainingReadOnlyFiles"); 

(Change it a bit so as not to momentarily flash a cmd window; recipes for that are available all over the internet, and one sketch follows.)
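
One way to suppress the window, for example (a sketch using standard ProcessStartInfo options; the path is illustrative):

var psi = new System.Diagnostics.ProcessStartInfo("cmd.exe",
    "/c rmdir /s /q \"C:\\Test\\TestDirectoryContainingReadOnlyFiles\"")
{
    CreateNoWindow = true,   // do not flash a console window
    UseShellExecute = false
};
using (var process = System.Diagnostics.Process.Start(psi))
{
    process.WaitForExit(); // optionally inspect process.ExitCode here
}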

Piyush Soni
  • Nice to share with us but would you be so kind as to include the bit of change needed to prevent firing the cmd window, instead of prompting us to search for it over the net? – ThunderGr Nov 28 '12 at 11:55
  • This doesn't work. In the same situation where I can delete the file from a command prompt or Explorer, using this code to call rmdir gives exit code 145 which translates to "The directory is not empty". It leaves the directory empty but still in place too, exactly like Directory.Delete("", true) – Kevin Coulombe Feb 16 '13 at 05:42
  • @Kevin Coulombe, Humm ... Are you sure you are using the /s/q switches? – Piyush Soni Feb 19 '13 at 05:04
  • @Piyush Soni : The content of the directory was deleted correctly. It only left the empty directory in place and it returned error 145. From what I gathered on the MS forums, it returns this error in many different scenarios. I didn't investigate more than that though. I use third party COM components so maybe one of them keeps file handles open when it shouldn't. In any event, I had to make sure not to leak disk space over time so I went for a garbage collection module to make sure the file got deleted later if it failed for whatever reason. Sorry I can't provide more information. – Kevin Coulombe Feb 19 '13 at 06:50
  • 1
    @KevinCoulombe: Yes, it must be those COM components. When I try through plain old C#, it works and it does delete the directory along with the files inside (read only or non-read only ones). – Piyush Soni Feb 19 '13 at 19:09
  • Sounds like a bad idea for me coz this solution won't work on Mono/Linux – frenchone Mar 15 '13 at 14:09
  • @frenchone: And that somehow makes it a bad *idea*? I'm sure there would be equivalent commands to do the same on linux with mono. – Piyush Soni Mar 15 '13 at 19:37
  • 5
    If you start to rely on external components for what should be in the framework then it's a "less than ideal" idea coz it's not portable anymore (or more difficult). What if the exe are not there ? Or the /option changed ? If the solution by Jeremy Edwards works then it should be preferred IMHO – frenchone Mar 19 '13 at 17:53
  • Another good alternative : https://stackoverflow.com/a/648055/1417104 – Kevin Dimey Mar 18 '21 at 15:28
8

I had those weird permission problems deleting User Profile directories (in C:\Documents and Settings) despite being able to do so in the Explorer shell.

File.SetAttributes(target_dir, FileAttributes.Normal);
Directory.Delete(target_dir, false);

It makes no sense to me what a "file" operation does on a directory, but I know that it works and that's enough for me!
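
The likely reason it works: File.SetAttributes maps to the Win32 SetFileAttributes call, which accepts directories too, and DirectoryInfo exposes the same flags via the Attributes property inherited from FileSystemInfo. The equivalent, arguably clearer, form would be (a sketch):

var info = new DirectoryInfo(target_dir);
info.Attributes = FileAttributes.Normal; // clears read-only on the directory itself
info.Delete(false);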

p.campbell
  • 2
    Still no hope when the directory has lots of files and Explorer is open to the folder containing those files. – sees Mar 14 '13 at 10:55
3

Recursive directory deletion that does not delete files is certainly unexpected. My fix for that:

public class IOUtils
{
    public static void DeleteDirectory(string directory)
    {
        // string[] has no ForEach instance method; Array.ForEach is the static equivalent.
        Array.ForEach(Directory.GetFiles(directory, "*", SearchOption.AllDirectories), File.Delete);
        Directory.Delete(directory, true);
    }
}

I experienced cases where this helped, but generally, Directory.Delete deletes files inside directories upon recursive deletion, as documented on MSDN.

From time to time I encounter this irregular behavior also as a user of Windows Explorer: sometimes I cannot delete a folder (I think the nonsensical message is "access denied"), but when I drill down and delete lower items I can then delete the upper items as well. So I guess the code above deals with an OS anomaly - not with a base class library issue.

citykid
  • Thanks. That's helpful but it still doesn't deal with `Delete()` throwing the exception. – Suncat2000 May 04 '21 at 14:37
  • yes. harder cases require harder measures - if a file for instance is locked one might alert, retry and so on. that was just a fix in one - presumably common - case. – citykid May 05 '21 at 16:31
3

This answer is based on: https://stackoverflow.com/a/1703799/184528. The difference in my code is that we only recurse, manually deleting sub-directories and files, when necessary, i.e. when a call to Directory.Delete fails on the first attempt (which can happen because of Windows Explorer looking at a directory).

    public static void DeleteDirectory(string dir, bool secondAttempt = false)
    {
        // If this is a second try, we are going to manually 
        // delete the files and sub-directories. 
        if (secondAttempt)
        {
            // Yield the current thread to allow Explorer time to release a directory handle
            Thread.Sleep(0);

            // Delete any files in the directory 
            foreach (var f in Directory.GetFiles(dir, "*.*", SearchOption.TopDirectoryOnly))
                File.Delete(f);

            // Try manually recursing and deleting sub-directories 
            foreach (var d in Directory.GetDirectories(dir))
                DeleteDirectory(d);

            // Now we try to delete the current directory
            Directory.Delete(dir, false);
            return;
        }

        try
        {
            // First attempt: use the standard MSDN approach.
            // This will throw an exception if a directory is open in explorer
            Directory.Delete(dir, true);
        }
        catch (IOException)
        {
            // Try again to delete the directory manually recursing. 
            DeleteDirectory(dir, true);
        }
        catch (UnauthorizedAccessException)
        {
            // Try again to delete the directory manually recursing. 
            DeleteDirectory(dir, true);
        } 
    }
cdiggins
  • So how is it supposed to delete the folder if there was an `UnauthorizedAccessException`? It would just throw, again. And again. And again... Because each time it's going to go to the `catch` and call the function again. A `Thread.Sleep(0);` doesn't change your permissions. It should just log the error and fail gracefully, at that point. And this loop will just continue as long as the (sub-)directory is open - it does not close it programmatically. Are we prepared to just let it do this for as long as those things are left open? Is there a better way? – vapcguy Jan 26 '17 at 15:54
  • If there is an `UnauthorizedAccessException` it will manually try to delete each file manually. So it continues to make progress by traversing into the directory structure. Yes, potentially every file and directory will throw the same exception, but this can also occur simply because explorer is holding a handle to it (see http://stackoverflow.com/a/1703799/184528) I will change the "tryAgain" to "secondTry" to make it more clear. – cdiggins Jan 26 '17 at 17:11
  • To answer more succintly, it passes "true" and executes a different code path. – cdiggins Jan 26 '17 at 17:12
  • Right, saw your edit, but my point isn't with the deletion of files, but with the deletion of the directory. I wrote some code where I could do essentially `Process.Kill()` on any process a file may be locked by, and delete the files. Problem I run into is when deleting a directory where one of those files was still open (see http://stackoverflow.com/questions/41841590/delete-a-directory-where-someone-has-opened-a-file). So going back through this loop, no matter what else it's doing, if it does `Directory.Delete()` on that folder again, it will still fail if that handle can't be released. – vapcguy Jan 26 '17 at 19:13
  • And same would occur for an `UnauthorizedAccessException` since deleting files (assuming this was even allowed, because to get to that code, it failed on `Directory.Delete()`) doesn't magically give you permission to delete the directory. – vapcguy Jan 26 '17 at 19:14
3

None of the above solutions worked well for me. I ended up using an edited version of @ryascl's solution, as below:

    /// <summary>
    /// Depth-first recursive delete, with handling for descendant 
    /// directories open in Windows Explorer.
    /// </summary>
    public static void DeleteDirectory(string path)
    {
        foreach (string directory in Directory.GetDirectories(path))
        {
            Thread.Sleep(1);
            DeleteDir(directory);
        }
        DeleteDir(path);
    }

    private static void DeleteDir(string dir)
    {
        try
        {
            Thread.Sleep(1);
            Directory.Delete(dir, true);
        }
        catch (IOException)
        {
            DeleteDir(dir);
        }
        catch (UnauthorizedAccessException)
        {
            DeleteDir(dir);
        }
    }
cahit beyaz
2

As mentioned above, the "accepted" solution fails on reparse points. There's a much shorter solution that properly replicates the functionality:

public static void rmdir(string target, bool recursive)
{
    string tfilename = Path.GetDirectoryName(target) +
        (target.Contains(Path.DirectorySeparatorChar.ToString()) ? Path.DirectorySeparatorChar.ToString() : string.Empty) +
        Path.GetRandomFileName();
    Directory.Move(target, tfilename);
    Directory.Delete(tfilename, recursive);
}

I know it doesn't handle the permissions cases mentioned later, but for all intents and purposes this provides the expected functionality of the original/stock Directory.Delete() FAR BETTER - and with a lot less code too.

You can safely carry on processing because the old dir will be out of the way... even if it's not gone yet because the 'file system is still catching up' (or whatever excuse MS gave for providing a broken function).

As a benefit, if you know your target directory is large/deep and don't want to wait (or bother with exceptions) the last line can be replaced with:

    ThreadPool.QueueUserWorkItem((o) => { Directory.Delete(tfilename, recursive); });

You are still safe to carry on working.

Jean-François Fabre
Rob
  • 3
    Can your assignment be simplified by: string tfilename = Path.Combine(Path.GetDirectoryName(target), Path.GetRandomFileName()); – Pete Jun 29 '16 at 09:19
  • 1
    I have to agree with Pete. Code as written will not add the separator. It took my path of `\\server\C$\dir` and made it `\\server\C$asf.yuw`. As a result I got an error on the `Directory.Move()` -- `Source and destination path must have identical roots. Move will not work across volumes.` Worked fine once I used Pete's code EXCEPT neither handles for when there are locked files or open directories-so it never gets to the `ThreadPool` command. – vapcguy Jan 24 '17 at 18:18
  • 1
    CAUTION: This answer should only be used with recursive=true. When false, this will Move the directory even if it is not empty. Which would be a bug; correct behavior in that case is to throw an exception, and leave the directory as it was. – ToolmakerSteve Apr 30 '18 at 12:18
2

Is it possible you have a race condition where another thread or process is adding files to the directory?

The sequence would be:

Deleter process A:

  1. Empty the directory
  2. Delete the (now empty) directory.

If someone else adds a file between 1 & 2, then maybe 2 would throw the exception listed?

Douglas Leeder
2

You don't have to create an extra method for recursion or delete files inside the folder separately. This is all done automatically by calling

DirectoryInfo.Delete();

Details are here.

Something like this works quite well:

var directoryInfo = new DirectoryInfo("My directory path");

// Delete all sub-directories (and their files) from the app data directory.
foreach (var subDirectory in directoryInfo.GetDirectories())
{
    subDirectory.Delete(true); // true sets the recursive parameter: sub-files and sub-folders with files are deleted too
}

Passing true to the Delete method will delete sub-files and sub-folders with files, too.

nzrytmn
2

I have spent a few hours solving this problem and other exceptions with deleting the directory. This is my solution:

private static readonly object _lock = new object(); // guards the directory sweep below

public static void DeleteDirectory(string target_dir)
{
    DeleteDirectoryFiles(target_dir);
    while (Directory.Exists(target_dir))
    {
        lock (_lock)
        {
            DeleteDirectoryDirs(target_dir);
        }
    }
}

private static void DeleteDirectoryDirs(string target_dir)
{
    System.Threading.Thread.Sleep(100);

    if (Directory.Exists(target_dir))
    {
        string[] dirs = Directory.GetDirectories(target_dir);

        if (dirs.Length == 0)
            Directory.Delete(target_dir, false);
        else
            foreach (string dir in dirs)
                DeleteDirectoryDirs(dir);
    }
}

private static void DeleteDirectoryFiles(string target_dir)
{
    string[] files = Directory.GetFiles(target_dir);
    string[] dirs = Directory.GetDirectories(target_dir);

    foreach (string file in files)
    {
        File.SetAttributes(file, FileAttributes.Normal);
        File.Delete(file);
    }

    foreach (string dir in dirs)
    {
        DeleteDirectoryFiles(dir);
    }
}

This code has a small delay, which is not important for my application. But be careful, the delay may be a problem for you if you have a lot of subdirectories inside the directory you want to delete.

Demid
  • 8
    -1 What's the delay about? No programming by coincidence please! – Ruben Bartelink Feb 07 '12 at 08:51
  • @Ruben -1 for an unexplained delay? This is quite a punishment for something like that, IMO! – ThunderGr Nov 28 '12 at 11:36
  • The 4 upvotes on the comment are [for a reason](http://pragprog.com/the-pragmatic-programmer/extracts/coincidence). There is no reason to believe that you've come up with a solution that won't randomly fail. The `lock(_lock)` is similarly dubious for me. @Roger I disagree with your complaint too. Imagine if someone in your organisation lashed this code into your app and you discovered it at 4:30 PM after spending 20 minutes wondering why it sometimes works and sometimes inexplicably doesnt. The code is not the value, it's the trust in the care that went into it. So, *WHY the delay?* Comments! – Ruben Bartelink Nov 28 '12 at 11:54
  • 1
    @Ruben I did not say you are wrong about it. I just said that downvoting it just for this one is a harsh punishment. I do agree with you, however, the 4 upvotes had not resulted in 4 downvotes. I would upvote your comment as well, but I wouldn't downvote the answer because of an unexplained delay :) – ThunderGr Nov 28 '12 at 12:02
  • @ThunderGr The unexplained delay is just a specific example of Programming By Conicidence. I don't downvote to *punish* anyone - I have far better things to be doing on the planet. I wouldnt have downvoted if the score was 0, but I encountered the answer before more useful answers. The corollary to the why isnt there 4 downvotes is ... why aren't there any upvotes (please let us not start though!). Anyway, I won't be going any more [XKCD 386](http://xkcd.com/386/) on this for now. – Ruben Bartelink Nov 28 '12 at 12:55
  • 1
    @RubenBartelink and others: while I don't specifically like this code (I have posted another solution with a similar approach), the delay here is reasonable. The issue is most likely outside of the app's control; perhaps another app rescans the FS periodically, thus locking the folder for short periods of time. The delay solves the issue, getting the bug report count down to zero. Who cares if we have no frigging idea as to the root cause? – Andrey Tarantsov May 20 '13 at 09:56
  • 1
    @RubenBartelink In fact, when you think about it, _not_ using a delay-and-retry approach during NTFS directory deletion is an irresponsible solution here. Any kind of ongoing file traversal blocks the deletion, so it's bound to fail sooner or later. And you can't expect all third-party search, backup, antivirus and file management tools to stay out of your folder. – Andrey Tarantsov May 20 '13 at 10:10
  • @AndreyTarantsov Should the delay be 100 - why not 50 or 200; Why is yours 50? Why infinite here and 10 iters in yours? Would you like a random blocking algorithm like this in the framework? Would you like one that gives up without telling anyone (except a debugger?) ? Both this and your answer are equally debatable for me. – Ruben Bartelink May 20 '13 at 12:17
  • @RubenBartelink While I'm late to the party, to provide a rationale, there'd be no way to know how long any 1 *thing* (be it an anti-virus scanner, file backup routine, etc.) is going to lock a file. I suppose there might be statistics and one might take the highest lock time of all software, add in a small buffer for safety, and expect the application to be able to perform its function after that delay. But there's no way to know what software is on the target PC if we want something fully generic & one-size-fits-all. Maybe what is a good time with today's software won't be tomorrow. – vapcguy Jan 24 '17 at 16:51
  • 1
    @RubenBartelink Another ex., say you give a delay of 100ms, and the highest lock time of any software on the target PC is the AV software=90ms. Say it also has backup software that locks files for 70ms. Now the AV locks a file, your app waits 100ms, which is normally fine, but then encounters another lock because the backup software starts grabbing the file at the 70ms mark of the AV scan, and so will take another 40ms to release the file. So while the AV software takes longer & your 100ms is normally longer than either of the 2 apps, you still have to account for when it starts in the middle. – vapcguy Jan 24 '17 at 17:06
1

I've had this same problem with Windows Workflow Foundation on a build server with TFS2012. Internally, the workflow called Directory.Delete() with the recursive flag set to true. It appears to be network related in our case.

We were deleting a binary drop folder on a network share before re-creating and re-populating it with the latest binaries. Every other build would fail. When opening the drop folder after a failed build, the folder was empty, which indicates that every aspect of the Directory.Delete() call was successful except for deleting the actual directory.

The problem appears to be caused by the asynchronous nature of network file communications. The build server told the file server to delete all of the files and the file server reported that it had, even though it wasn't completely finished. Then the build server requested that the directory be deleted and the file server rejected the request because it hadn't completely finished deleting the files.

Two possible solutions in our case:

  • Build up the recursive deletion in our own code with delays and verifications between each step
  • Retry up to X times after an IOException, giving a delay before trying again

The latter method is quick and dirty but seems to do the trick.

Shaun
1

This is because of FileChangesNotifications.

It has happened since ASP.NET 2.0. When you delete a folder within an app, the app gets restarted. You can see it yourself, using ASP.NET Health Monitoring.

Just add this code to your web.config/configuration/system.web:

<healthMonitoring enabled="true">
  <rules>
    <add name="MyAppLogEvents" eventName="Application Lifetime Events" provider="EventLogProvider" profile="Critical"/>
  </rules>
</healthMonitoring>


After that, check out Windows Log -> Application. Here is what is going on:

When you delete a folder, if there is any sub-folder, Delete(path, true) deletes the sub-folder first. That is enough for FileChangesMonitor to know about the removal and shut down your app. Meanwhile, your main directory is not deleted yet. This is the event from the log:


(screenshot: the application-shutdown lifetime event in the Windows event log)


Delete() didn't finish its work and, because the app is shutting down, it raises an exception:

(screenshot: the exception raised while the app shuts down)

When you do not have any subfolders in a folder that you are deleting, Delete() just deletes all files and that folder; the app gets restarted too, but you don't get any exceptions, because the app restart doesn't interrupt anything. Still, you lose all in-process sessions, the app doesn't respond to requests while restarting, etc.

What now?

There are some workarounds and tweaks to disable this behaviour (Directory Junctions, turning off FCN with the registry, stopping FileChangesMonitor using reflection, since there is no exposed method), but none of them seem right, because FCN is there for a reason: it is looking after the structure of your app, which is not the structure of your data. The short answer is: place the folders you want to delete outside of your app. FileChangesMonitor will get no notifications and your app will not be restarted every time. You will get no exceptions. To make the files visible from the web there are two ways:

  1. Make a controller that handles incoming calls and then serves the files back by reading from the folder outside the app (outside wwwroot); see the sketch after this list.

  2. If your project is big and performance is most important, set up a separate, small and fast web server for serving static content, leaving IIS to its specific job. It could be on the same machine (mongoose for Windows) or another machine (nginx for Linux). The good news is you don't have to pay for an extra Microsoft license to set up a static content server on Linux.
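
For option 1, a minimal ASP.NET MVC-style sketch (assuming System.Web.Mvc; the folder path, action name, and content type are illustrative assumptions):

public class ContentController : Controller
{
    // Serves files from a data folder that lives outside the app root,
    // so deleting items there never triggers an app restart.
    private const string Folder = @"D:\AppData\Files"; // outside wwwroot

    public ActionResult GetFile(string fileName)
    {
        // Path.GetFileName strips any directory part, avoiding path traversal.
        var fullPath = Path.Combine(Folder, Path.GetFileName(fileName));
        if (!System.IO.File.Exists(fullPath))
            return HttpNotFound();
        return File(fullPath, "application/octet-stream", fileName);
    }
}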

Hope this helps.

Roman
1

It appears that having the path or a subfolder selected in Windows Explorer is enough to block a single execution of Directory.Delete(path, true), throwing an IOException as described above and dying, instead of booting Windows Explorer out to a parent folder and proceeding as expected.

David Alpert
  • This appears to have been my problem. As soon as I closed Explorer and ran again, no exception. Even selecting the parent's parent wasn't enough. I had to actually close Explorer. – Scott Marlowe Feb 14 '13 at 00:16
  • Yes, this happens and is a cause. So any idea how to programmatically deal with it, or is the answer just always to make sure all 1000 users have that folder closed? – vapcguy Jan 24 '17 at 18:36
1

The directory or a file in it is locked and cannot be deleted. Find the culprit who locks it and see if you can eliminate it.

Vilx-
1

This problem can appear on Windows when there are files in a directory (or in any of its subdirectories) whose path length is greater than 260 characters.

In such cases you need to delete \\?\C:\mydir instead of C:\mydir. You can read about the 260-character limit here.
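
A sketch of the workaround (an assumption: the runtime must accept extended-length paths, which older .NET Framework versions reject in System.IO; .NET 4.6.2+ and .NET Core handle them):

string dir = @"C:\mydir"; // illustrative
// The \\?\ prefix tells Windows to skip the MAX_PATH (260-character) check.
Directory.Delete(@"\\?\" + dir, true);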

zx485
HostageBrain
1

I had this problem today. It was happening because I had Windows Explorer open to the directory that was being deleted, causing the recursive call to fail and thus the IOException. Make sure there are no handles open to the directory.

Also, MSDN is clear that you don't have to write your own recursion: http://msdn.microsoft.com/en-us/library/fxeahc5f.aspx

groksrc
0

If your application's (or any other application's) current directory is the one you're trying to delete, it will not be an access violation error but "directory is not empty". Make sure it's not your own application by changing the current directory; also, make sure the directory is not open in some other program (e.g. Word, Excel, Total Commander, etc.). Most programs will cd to the directory of the last file opened, which would cause that.

configurator
0

I resolved one possible instance of the stated problem when methods were async and coded like this:

// delete any existing update content folder for this update
if (await fileHelper.DirectoryExistsAsync(currentUpdateFolderPath))
       await fileHelper.DeleteDirectoryAsync(currentUpdateFolderPath);

With this:

bool exists = false;                
if (await fileHelper.DirectoryExistsAsync(currentUpdateFolderPath))
    exists = true;

// delete any existing update content folder for this update
if (exists)
    await fileHelper.DeleteDirectoryAsync(currentUpdateFolderPath);

Conclusion? There is some asynchronous aspect of getting rid of the handle used to check existence that Microsoft has not been able to speak to. It's as if the asynchronous method inside an if statement has the if statement acting like a using statement.

0

In the case of network files, Directory.Delete(path, true) (internally, DeleteHelper) might cause an IOException resulting from the delay in deleting a file.

crowdy
0

I've solved it with this age-old technique (you can leave the Thread.Sleep on its own in the catch):

bool deleted = false;
do
{
    try
    {
        Directory.Delete(rutaFinal, true);
        deleted = true;
    }
    catch (Exception e)
    {
        string mensaje = e.Message;
        if (mensaje == "The directory is not empty.")
            Thread.Sleep(50);
    }
} while (deleted == false);
0

In addition to the many comprehensive answers here about Windows Explorer and file handles, I encountered the DirectoryNotFoundException when I had a path that was too long. For some reason our unzipping code can create paths 270 characters long. That's more than the max of 260 in my case. When I try to delete it with Directory.Delete(folder, recursive: true), it tells me it can't find the directory.

Edit: This wouldn't have been causing OP's issue though, because the error is not intermittent - it errors on every attempt.

-1

I solved the problem in this way:

foreach (DirectoryInfo directoryInfo in directory.GetDirectories())
{
    var creationDate = DateTime.Now - directoryInfo.LastWriteTime;

    while (true)
    {
        var files = Directory.GetFiles(directoryInfo.FullName);

        if (files.Length == 0)
        {
            break;
        }

        Operations.ShellAdmin("DEL *.* /Q", directoryInfo.FullName);
    }

    directoryInfo.Delete();
}
  • As it’s currently written, your answer is unclear. Please [edit] to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – Community Jul 17 '23 at 11:43
  • What is the purpose of the (inaccurately named) creationDate variable, and what does this solution do to address the problem in the question that wasn't covered by other answers? Permission problems and other issues (like certain kinds of reparse point) will still cause this to fail, but now you don't get an exception that says why it failed. It can even get into an infinite loop in some failure cases. – dodexahedron Jul 20 '23 at 02:56
-2

None of the above answers worked for me. It appears that my own app's usage of DirectoryInfo on the target directory was causing it to remain locked.

Forcing garbage collection appeared to resolve the issue, but not right away: a few attempts to delete were required.

Note the Directory.Exists check, as the directory can disappear after an exception. I don't know why the delete was delayed for me (Windows 7 SP1).

for (int attempts = 0; attempts < 10; attempts++)
{
    try
    {
        if (Directory.Exists(folder))
        {
            Directory.Delete(folder, true);
        }
        return;
    }
    catch (IOException)
    {
        GC.Collect();
        Thread.Sleep(1000);
    }
}

throw new Exception("Failed to remove folder.");
Reactgular
  • 1
    -1 Programming by coincidence. What object does what when GC'd ? Is this in any way good general advice? (I believe you when you say you had a problem and that you used this code and that you feel you don't have a problem now but that's just not the point) – Ruben Bartelink Jan 20 '14 at 11:09
  • @RubenBartelink I agree. It's a hack. Voodoo code that does something when it's not clear what it's solving or how. I would love a proper solution. – Reactgular Jan 20 '14 at 11:59
  • 1
    My problem is that anything it adds over and above http://stackoverflow.com/a/14933880/11635 is highly speculative. If I could, I'd be giving a -1 for duplication and a -1 for speculation/programming by coincidence. Sprinkling `GC.Collect` is a) just Bad Advice and b) not a sufficiently common general cause of locked dirs to merit inclusion here. Just pick one of the others and don't sow more confusion in the minds of innocent readers – Ruben Bartelink Jan 20 '14 at 14:28
  • 3
    Use GC.WaitForPendingFinalizers(); after GC.Collect(); this will work as expected. – Heiner Apr 11 '14 at 11:16
  • Not sure, untested, but perhaps better would be to do something with a `using` statement, then: `using (DirectoryInfo di = new DirectoryInfo(@"c:\MyDir")) { for (int attempts = 0; attempts < 10; attempts++) { try { if (di.Exists(folder)) { Directory.Delete(folder, true); } return; } catch (IOException e) { Thread.Sleep(1000); } } }` – vapcguy Jan 26 '17 at 20:31