
I want to use Get-FileHash to populate a set of hashes for certain directories. Here is the code:

dir "C:\" -Recurse | Get-FileHash -Algorithm MD5

But it shows the error below:

Get-FileHash : The file 'C:\Intel\Logs\IntelCPHS.log' cannot be read: The process cannot access the file
'C:\Intel\Logs\IntelCPHS.log' because it is being used by another process.
At line:2 char:22
+ dir "C:\" -Recurse | Get-FileHash -Algorithm MD5| Export-Csv -Path "C ...
+                      ~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ReadError: (C:\Intel\Logs\IntelCPHS.log:PSObject) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : FileReadError,Get-FileHash

Kindly help with this, or is there any other way to populate the hashes?
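If skipping the unreadable files is acceptable, one simple approach is to let Get-FileHash continue past its (non-terminating) read errors and collect them for later review. A sketch, with a placeholder output path; -ErrorAction and -ErrorVariable are the standard common parameters:

```powershell
# Hash everything that can be read; record failures instead of stopping.
dir 'C:\' -Recurse -File |
    Get-FileHash -Algorithm MD5 -ErrorAction SilentlyContinue -ErrorVariable hashErrors |
    Export-Csv -Path 'C:\Temp\hashes.csv' -NoTypeInformation

# Inspect which files could not be read ($_.TargetObject should hold the
# offending path, per the error record shown above).
$hashErrors | ForEach-Object { $_.TargetObject }
```

This does not hash the locked files, of course; it only keeps the rest of the run from being cluttered with errors.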

  • Did you try to stop the process which uses the IntelCPHS.log? – Alex_P Oct 19 '19 at 18:10
  • 2
    If the file is exclusively locked by something else -- which is what it sounds like here because `Get-FileHash` is read-only -- then nothing is going to be able to access it until the process locking the file closes it. Exclusively locked is exclusively locked. However, the utility of hashing a log file is questionable. I would expect a log file to naturally be different on every system since it's likely to contain timestamps. I would consider skipping the log file entirely. – Bacon Bits Oct 20 '19 at 02:52
  • @BaconBits Thanks for the explanation. Is there any way to generate the hash without closing the process? – useruseruser Oct 20 '19 at 03:52
  • 1
    @user3033044 Not easily. You could write a program that uses the shadow copy service to get a snapshot of the file. That's usually how backup programs save open files, but it would not necessarily be the most up to date. It would be much, much easier to find the process that has it locked and stop that. – Bacon Bits Oct 20 '19 at 05:08
  • Why do you want to calculate the hash of a file that's continually changing? – Mathias R. Jessen Jun 25 '21 at 18:26

3 Answers


I haven't delved into the code of Get-FileHash, but it probably opens the file requesting the FileShare.Read flag only. Such an open fails if another process already has the file open for writing, because the sharing flags specified by each subsequent opener must be compatible with the access held by the process that originally opened the file; see this Q&A for details.
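You can reproduce the sharing conflict directly with [IO.File]::Open (a sketch; the path is a placeholder):

```powershell
# Writer keeps the file open for read/write, sharing it for read/write.
$writer = [IO.File]::Open('C:\Temp\test.log', 'OpenOrCreate', 'ReadWrite', 'ReadWrite')

# Requesting FileShare.Read now fails: our share mode refuses to coexist
# with writers, but a writer already holds the file.
try {
    [IO.File]::Open('C:\Temp\test.log', 'Open', 'Read', 'Read')
}
catch {
    $_.Exception.Message   # "...because it is being used by another process."
}

# Requesting FileShare.ReadWrite succeeds, because it tolerates the writer.
$reader = [IO.File]::Open('C:\Temp\test.log', 'Open', 'Read', 'ReadWrite')
$reader.Close(); $writer.Close()
```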

A workaround is to explicitly create a stream using the desired share flags and pass it to Get-FileHash via its -InputStream parameter:

Get-ChildItem 'C:\' -File -Recurse -PipelineVariable File | ForEach-Object {
    $stream = try {
        [IO.FileStream]::new( $File.FullName, [IO.FileMode]::Open, [IO.FileAccess]::Read, [IO.FileShare]::Read )
    }
    catch {
        # Fallback: another process may have the file open for writing, so retry with FileShare.ReadWrite.
        [IO.FileStream]::new( $File.FullName, [IO.FileMode]::Open, [IO.FileAccess]::Read, [IO.FileShare]::ReadWrite )
    }

    if( $stream ) {
        try {
            Get-FileHash -InputStream $stream -Algorithm MD5 | 
                Select-Object Algorithm, Hash, @{ Name = 'Path'; Expression = { $File.Fullname } }
        }
        finally {
            $stream.Close()
        }
    }
}

NOTES:

  • -PipelineVariable File is used to avoid ambiguity of the $_ variable in the catch block and in the Expression script block of the Get-FileHash call.
  • $stream = try { } catch { } is a convenient way to capture the output of both the try and the catch block without having to repeat the variable name. It is equivalent to try { $stream = [IO.FileStream]::new(...) } catch { $stream = [IO.FileStream]::new(...) }. This also works for many other statements, such as if / else, switch and for.
  • Using Select-Object, a calculated property is added to fix the Path property. When -InputStream is used, Get-FileHash has no idea of the file's path, so it would otherwise output an empty Path property.
  • It is always a good idea to explicitly close streams in a finally block, due to the nondeterministic nature of the garbage collector, which eventually closes the file but possibly very late. Shared resources such as files should be kept open only as long as necessary, to avoid unnecessarily blocking other processes.
  • On a side note, Get-Content has no problem reading files opened with FileShare.ReadWrite. It is probably worth investigating how Get-FileHash is implemented compared to Get-Content, and possibly creating an issue in the PowerShell GitHub project.
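A quick way to see the difference the last note describes (the temp-file path is just an example; the Get-FileHash failure is the behavior from the question, observed on Windows PowerShell 5.1):

```powershell
# Keep a writer handle open on a temp file.
$writer = [IO.File]::Open("$env:TEMP\locked.txt", 'OpenOrCreate', 'ReadWrite', 'ReadWrite')

Get-Content "$env:TEMP\locked.txt"               # works, no sharing error
Get-FileHash "$env:TEMP\locked.txt" -Algorithm MD5   # errors: file is in use

$writer.Close()
```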
zett42
  • 1
    Just ran into this same issue. 7z is splitting a large file into 100 parts. The first 38 are done and won't be changing again, but is still locked by 7z. Beyond Compare on the two folders has successfully pulled these 38 files, and I have hashed those on the remote server, but I wanted to be able to hash these first 38 on the source server for a comparison without waiting for several hours until 7z finishes the 100th file and unlocks all of them. I was able to solve the issue using your solution. Thanks. – SteveSims Feb 10 '23 at 19:07

As @Alex_P mentions, stop the process; I believe it's IntelCpHeciSvc.exe. If you run into a lot of these, you could also try running the cmdlet after booting Windows into Safe Mode; that gets rid of many background processes that may trigger this error.

Feel free to ask further questions if you need some additional help.

Thestorum

With Sysinternals Process Explorer, you can search for the file handle and see which process is locking it.
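Process Explorer is a GUI tool; Sysinternals also ships handle.exe, which performs the same handle search from the command line (it typically needs an elevated prompt, and the path below is just the file from the question):

```powershell
# Lists every process holding a handle to the given file.
handle.exe C:\Intel\Logs\IntelCPHS.log
```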

js2010