While testing some things about writing, creating, and deleting files, I made the following program, which deletes and recreates a file in a loop n times.
static string path = @"C:\Users\Remy\Desktop\Testing";
static readonly int sampleSize = 10000; // Number of iterations the loop will run for.
static byte[] sourceFile;

static void Main(string[] args)
{
    // Read the source file into memory once, so the loop itself only creates and deletes.
    using (FileStream fs = new FileStream(path + @"\SourceFile.txt", FileMode.Open, FileAccess.Read))
    {
        sourceFile = new byte[fs.Length];
        fs.Read(sourceFile, 0, sourceFile.Length);
    }

    string filePath = path + @"\Destination.txt";
    for (int i = 0; i < sampleSize; i++)
    {
        if (File.Exists(filePath))
        {
            File.SetAttributes(filePath, FileAttributes.Normal);
            File.Delete(filePath); // Error sometimes throws here.
        }
        using (FileStream file = File.OpenWrite(filePath))
        {
            file.Write(sourceFile, 0, sourceFile.Length);
        }
    }
    Console.ReadLine();
}
This program works as expected most of the time when the number of iterations isn't too high (around 1000): it deletes the old file and creates a new one.
However, when I increase the number of iterations to 10000/100000, issues arise where on rare occasions (about 0.03% of the time) it throws System.UnauthorizedAccessException at using (FileStream file = File.OpenWrite(filePath)), while successfully passing the other 99.97% of the time. When the error throws, the file doesn't get created. This happens both in VS (as admin) using Debug/Release, and with the built .exe run as administrator.
While looking into this issue I found the following answers regarding UnauthorizedAccessException:
- This answer suggests setting the attributes, but as seen in my example I already do that.
- This and some other answers suggest running the application with admin rights, which I'm already doing as well.
I also changed the permissions on the parent folder to allow Full control for Everyone on all files and subfolders.
At first I thought the file I was creating might not be big enough (I'm currently writing 976 KB of random data) and that for some reason the program iterated over creation/deletion faster than the OS/hard disk could handle. But the same behaviour occurs when increasing the file size.
I've tested it across 3 machines, and it happened on all of them.
Could this be a case of Windows throwing an exception as a false positive, seeing that it only happens on big iteration counts? Am I missing something completely different here?
Note: I'm not looking for a way to handle the exception; I can handle that. What I'm looking for is the reason why this odd behaviour happens and, if possible, a way to prevent it instead of curing it.
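For instance, would something along these lines sidestep it? This is only a sketch of a rename-before-delete idea (MoveAndDelete is a name I made up, not an existing API), based on the assumption that the delete hasn't fully completed by the time the next create runs:

    // Sketch: rename the file out of the way first, so a delete that is still
    // in progress applies to the temporary name rather than to filePath.
    static void MoveAndDelete(string filePath)
    {
        string tempPath = filePath + "." + Guid.NewGuid().ToString("N") + ".deleted";
        File.Move(filePath, tempPath); // Any pending delete now targets tempPath.
        File.Delete(tempPath);         // filePath is immediately free to be recreated.
    }

If the cause really is a delayed delete, this should at least move the collision away from the file name being recreated.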
Environment
The disk I'm writing to is a Crucial MX300 SSD, connected over SATA 3, no RAID. 16 GB RAM. OS: Windows 10 Pro, 64-bit. The system is kept as idle as possible while the program runs.
The console application targets .NET Framework 4.6.1 and is built with Visual Studio 2017 using the Release / Any CPU configuration.
Additional things I've tried per comment suggestions:
I tried adding a Thread.Sleep after creation and after deletion to make sure Windows gets a chance to clear the file cache. This still throws the exception, but this time it throws on File.Delete(filePath); instead, as sketched below.
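Roughly the variant I tried (the exact sleep duration was arbitrary; the 5 ms here is only illustrative):

    for (int i = 0; i < sampleSize; i++)
    {
        if (File.Exists(filePath))
        {
            File.SetAttributes(filePath, FileAttributes.Normal);
            File.Delete(filePath); // With the sleeps added, the exception throws here instead.
            Thread.Sleep(5);       // Arbitrary pause after deletion.
        }
        using (FileStream file = File.OpenWrite(filePath))
        {
            file.Write(sourceFile, 0, sourceFile.Length);
        }
        Thread.Sleep(5);           // Arbitrary pause after creation.
    }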
Turning off Windows Defender etc. also results in the error being thrown on File.Delete(filePath); instead of at using (FileStream ...) as well.
Writing to the file with the following instead:

    using (FileStream file = new FileStream(filePath, FileMode.Open, FileAccess.Write, FileShare.None))
    {
        file.Write(sourceFile, 0, sourceFile.Length);
        file.Flush(flushToDisk: true);
    }

also yields the same exception.