
In C# code that I am maintaining, I keep seeing this pattern:

File.Delete(string.Format("{0}/{1}.png", temp, ProductCode));
Thread.Sleep(20);
File.Delete(string.Format("{0}/{1}.pdf", temp, ProductCode));
Thread.Sleep(20);
File.Delete(string.Format("{0}/{1} - {2}.pdf", temp, ProductCode, ProductName));
Thread.Sleep(20);

MSDN says nothing one way or the other and I would like a definitive opinion if anyone can help: is it necessary to yield the processor between these successive deletions?

In this particular case there is no possibility of locks from other processes; delete won't fail. I was, however, uncertain as to whether there will be appropriate queueing if I issue several delete operations on the same thread without pause.
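For reference, a minimal sketch of the same cleanup without the sleeps (assuming `temp`, `ProductCode` and `ProductName` hold the same values as above). `File.Delete` is a synchronous call: it does not return until the delete has been handed to the file system, so no pause is needed between successive calls on the same thread.

```csharp
using System.IO;

// Synchronous deletes; each call completes before the next one starts,
// so no Thread.Sleep is needed between them.
File.Delete(Path.Combine(temp, ProductCode + ".png"));
File.Delete(Path.Combine(temp, ProductCode + ".pdf"));
File.Delete(Path.Combine(temp, $"{ProductCode} - {ProductName}.pdf"));
```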

Peter Wone
    Not really sure. When creating the code it probably consumed a lot of resources, so they added a sleep. I would personally NOT use it. – kevintjuh93 Sep 23 '15 at 06:05
  • 1
    It's not necessary, but I would not remove them if I were you, unless the sum of all of those 20 ms delays is impacting the program negatively. They were put there for a purpose, probably because of bad hardware, network issues, etc. – Mark Lakata Sep 23 '15 at 06:08

1 Answer


It is voodoo programming. Usually inspired by a programmer seeing that File.Delete() doesn't always actually delete the file. A pretty inevitable side-effect of running programs on a multi-tasking operating system, the kind that also lets other processes open files.

And there's a category of shrink-wrapped malware that programmers voluntarily install on their machines that can cause a delay. Anti-malware is on the top of that list, search indexers and cloud storage utilities are next.

Such programs open the file with the FileShare.Delete option to try to minimize their impact. That tends to work okay, until you do things like rewriting the file immediately after deleting it. That doesn't work: the file is in "delete pending" mode and you get slapped with an UnauthorizedAccessException.

The victim programmer then tends to notice that "waiting for a bit" helps to avoid the exception. Giving the other process enough time to finish whatever it is doing with the file and close the file handle. Once the last handle is closed, the file actually disappears from the file system and recreating it won't fail.
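When a recreate-after-delete genuinely cannot be avoided, a bounded retry loop is at least more honest than a fixed sleep. A sketch, with arbitrary illustrative values for the retry count and delay (these are not recommendations):

```csharp
using System;
using System.IO;
using System.Threading;

// Retry creating the file while it may still be in "delete pending" state.
// maxAttempts and the 50 ms delay are illustrative, not tuned values.
static FileStream CreateWithRetry(string path, int maxAttempts = 10)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            return new FileStream(path, FileMode.Create, FileAccess.Write);
        }
        catch (UnauthorizedAccessException) when (attempt < maxAttempts)
        {
            Thread.Sleep(50);   // give the other process time to close its handle
        }
    }
}
```

Even so, this only narrows the window; the strategy in the next paragraph avoids it entirely.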

Sleeping for 20 milliseconds is completely arbitrary and pretty unlikely to be enough. You can never know how long to wait, other than by repeatedly trying. The real problem is that this strategy is just a very broken way to recreate files. Deleting first, then creating and writing the file is very dangerous. As noted, deleting works fine but recreating the file is perilous. When this goes wrong, the user is left without a copy of the file. Complete data loss. Never a very desirable feature of a program :)

Don't use voodoo. The proper way to do it is to create a new file, then use File.Replace() to swap it with the old file, then delete the old file. Never any data loss, and the malware can do little to screw this up. Deleting a file by moving it to the recycle bin is a fine approach as well, as long as it doesn't happen too often: FileSystem.DeleteFile() in everybody's favorite namespace, Microsoft.VisualBasic.FileIO.
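A sketch of the replace-first approach; the file names here are hypothetical, and the "new content" stands in for whatever the program actually writes:

```csharp
using System.IO;

string target  = "document.pdf";        // existing file to be replaced
string staging = "document.pdf.new";    // write the new content here first
string backup  = "document.pdf.bak";    // File.Replace parks the old copy here

File.WriteAllText(staging, "new content");

// Swap the new file in; the old content survives as the backup,
// so there is no window in which the user has no copy at all.
File.Replace(staging, target, backup);
File.Delete(backup);
```

Note the ordering: the new content exists on disk before the old file is touched, which is exactly what the delete-then-recreate pattern gets wrong.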

Hans Passant
  • Oh, hello Hans, LTNS. In this specific case it's cleanup of temp files after a group of user-selected documents is bundled, zipped and delivered as a download. It's on a server with no chance of interference of the types you mention. Great answer. I know most of that but I wasn't totally sure whether file operations would queue properly if I deliver them in rapid fire on the same thread. One would hope so... – Peter Wone Sep 23 '15 at 23:53
  • Well, don't do that either. Use FileOptions.DeleteOnClose and you never have to say sorry. – Hans Passant Sep 23 '15 at 23:59
  • Hans helpfully provided a code example for this [in a previous answer](http://stackoverflow.com/a/8964246/18192). – Brian Sep 24 '15 at 13:48
  • After doing the refactoring necessary to use FileOptions.DeleteOnClose I realised I was in a position to just use memory streams instead of file streams. Memory use is up but it's transient. – Peter Wone Oct 01 '15 at 23:18