I'm currently trying to understand a problem I'm facing. I have the following code:
using (FileStream sw = File.Create(mypath))
{
    sw.Write(source, 0, bytesRead);
    sw.Flush();
}
This is used in a webservice that can be called multiple times (even concurrently).
The problem I'm facing is that one file suddenly ended up with duplicated content (the original content appeared twice in the file).
As I can't reproduce the problem, I'm wondering whether there is a scenario (under multithreading) where, despite using File.Create to create the stream, the content written by another call to the underlying method could end up appended to the file instead of overwriting it.
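For reference, my understanding of what File.Create(mypath) does under the hood is roughly the following (this is my assumption, not something I've verified against the reference source):

// What I assume File.Create(mypath) expands to: FileMode.Create truncates
// any existing file, and FileShare.None should block a second concurrent
// open of the same path while this stream is alive.
FileStream sw = new FileStream(mypath, FileMode.Create,
                               FileAccess.ReadWrite, FileShare.None);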
Edit: As requested, I'll try to explain in a bit more detail how the multiple calls could come about.
A third-party tool creates important files (.xml) and calls my webservice to transfer them onto a server. If that transfer fails for any reason, the third-party tool retries it. As I'm seeing multiple transfer attempts in the logs within minutes of each other, one fear I have (which I can neither prove nor disprove despite maximum logging) is that the first call takes too long and the second call arrives while the first one is still running, so that the two overlap. Sadly I can't find any proof for or against this in the logs available to me, so I'm assuming the worst case: that they DO overlap and a race condition occurs, which is what leads to this question.
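To make the suspected overlap concrete, here is a minimal sketch of how I've been trying to simulate it locally (SaveFile, the temp path and the payload are just stand-ins for my actual webservice method and data):

using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;

class OverlapSketch
{
    // Stand-in for the webservice method; same write pattern as above.
    static void SaveFile(string path, byte[] source, int bytesRead)
    {
        using (FileStream sw = File.Create(path))
        {
            sw.Write(source, 0, bytesRead);
            sw.Flush();
        }
    }

    static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(), "overlap-test.xml");
        byte[] payload = Encoding.UTF8.GetBytes("<data>content</data>");

        // Fire two "transfer attempts" at (almost) the same time,
        // mimicking the third-party tool retrying while the first
        // call is still running.
        Task first = Task.Run(() => SaveFile(path, payload, payload.Length));
        Task second = Task.Run(() => SaveFile(path, payload, payload.Length));

        try
        {
            Task.WaitAll(first, second);
        }
        catch (AggregateException ex)
        {
            // If the calls truly overlap, I'd expect a sharing violation
            // (IOException) here rather than appended content -- but that
            // is exactly the assumption I can't confirm on the server.
            foreach (Exception inner in ex.InnerExceptions)
                Console.WriteLine(inner.Message);
        }

        Console.WriteLine(File.ReadAllText(path));
    }
}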