I started a thread over here asking about "concurrent" writes to an XML file, and it got flagged as a duplicate and pointed here, to a thread that suggests creating a lock file in the same folder as the target file as a means of handling the situation.
That seems inelegant to me: writing a hidden file to the network, especially when we have the ability to lock a file, just not the ability (it seems) to lock a file and then, you know, do something with it.
So, my thought is to take a different approach.
1: Lock the file with `$file = [IO.File]::Open($path, 'Open', 'ReadWrite', 'None')`
I have verified that I can't open it a second time while that handle is held, so only one instance of my code can hold the lock at any one time.
2: Copy-Item to the local temp folder.
3: Read that copy and append data as needed.
4: Save back over the temp file.
5: Remove the lock with `$file.Close()`
6: Immediately Copy-Item the temp file back over the original.
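For reference, here's a rough sketch of those six steps (the paths are placeholders, and this is just how I picture it, not tested against a real share). One wrinkle: while the exclusive lock is held, `Copy-Item` on the original would hit a sharing violation even from the same script, so the sketch reads the content through the handle it already holds instead.

```powershell
# Sketch of the six steps above. $path is a placeholder for the shared XML file.
$path     = '\\server\share\data.xml'
$tempPath = Join-Path $env:TEMP (Split-Path $path -Leaf)

# 1: Lock the file. FileShare 'None' blocks every other open, including
#    a second instance of this script.
$file = [IO.File]::Open($path, 'Open', 'ReadWrite', 'None')
try {
    # 2: Get a local working copy. Copy-Item can't open the locked file,
    #    so read through the handle we already hold.
    $reader  = New-Object IO.StreamReader($file)
    $content = $reader.ReadToEnd()

    # 3 & 4: Append data as needed and save it to the temp file.
    $content += "`r`n<!-- appended data goes here -->"
    Set-Content -Path $tempPath -Value $content
}
finally {
    # 5: Remove the lock.
    $file.Close()
}

# 6: Immediately copy the temp file back over the original.
#    (This is the gap where another instance could sneak in.)
Copy-Item -Path $tempPath -Destination $path -Force
```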
The risk seems to be between steps 5 and 6: another instance could acquire a lock after the first instance releases its lock, but before it overwrites the file with the revised temp copy.
Is that risk the reason for the separate lock-file approach? Because then the "lock" stays in place until after the revisions are saved?
It all seems like so much nasty kludge for something I would think .NET or PowerShell should handle. I mean, a StreamReader/StreamWriter with a `-Lock` parameter that lets you pull the file in, mess with it, and save it back seems so basic and fundamental that I can't believe it isn't built in.