
When I try to overwrite an existing file in a folder, I get an error saying:

file is being used by another process

Does anybody know how I can get around this? I have tried the following, but no luck:

Copy-Item $backupDirectoryPathAndFolderName "C:\Program Files\TESTFOLDER" -Recurse -Container -Force
ED209
  • I guess you cannot, unless you close the process that is consuming the file. Also, what is the value of $backupDirectoryPathAndFolderName? Is it possible that you are copying from and writing to the same folder? – Prageeth Saravanan Jan 23 '17 at 16:03
  • Did you try to find out what is using the file and stop it? – autosvet Jan 23 '17 at 16:03
  • Possible duplicate of [Force close files that are in use when the script runs](http://stackoverflow.com/questions/24180361/force-close-files-that-are-in-use-when-the-script-runs) – henrycarteruk Jan 23 '17 at 17:25

1 Answer


If a file is open for writing, your only option is to release the lock. If the file is open remotely (use net file or openfiles to identify the session), you can close the remote file session. If, however, it's in use by a local process, you'll need openfiles with the global objects list enabled, or something like handle.exe from Sysinternals. Even then, you'll only identify the owning process; how you convince that process to close the handle is another problem altogether... unless you choose to kill the process, that is.
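
For the local-process case, a rough PowerShell sketch of that approach might look like the following. It assumes handle.exe (Sysinternals) is on the PATH, that the session is elevated, and that $backupDirectoryPathAndFolderName is the source folder from the question. Note it simply kills whichever processes hold handles under the destination, which is only acceptable if you can afford to lose their unsaved state:

    # Minimal sketch, not a drop-in solution: handle.exe on the PATH and an
    # elevated session are assumed; adjust paths and the kill step as needed.
    $source      = $backupDirectoryPathAndFolderName    # source folder from the question
    $destination = 'C:\Program Files\TESTFOLDER'

    # Ask handle.exe which processes hold handles under the destination folder.
    # Its output lines look roughly like "someapp.exe  pid: 1234  type: File ... path",
    # so a simple regex pulls out the PIDs.
    $handleOutput = & handle.exe -accepteula $destination 2>$null
    $lockingPids  = $handleOutput |
        Select-String -Pattern 'pid:\s*(\d+)' |
        ForEach-Object { [int]$_.Matches[0].Groups[1].Value } |
        Sort-Object -Unique

    foreach ($procId in $lockingPids) {
        $proc = Get-Process -Id $procId -ErrorAction SilentlyContinue
        if ($proc) {
            Write-Warning "Killing $($proc.ProcessName) (PID $procId) to release its handle."
            Stop-Process -Id $procId -Force    # blunt instrument: unsaved data in that process is lost
        }
    }

    # With the handles released, the original copy should succeed.
    Copy-Item $source $destination -Recurse -Container -Force

An openfiles-based variant would work similarly, but only after enabling the global objects list (openfiles /local on) and rebooting.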

Simon Catlin