Got another multi-step process I'm looking to streamline. Basically, I want to build a PowerShell script that does three things:

  1. Use Get-ChildItem to look for folders with a specific name (we'll call it NAME1 as a placeholder).
  2. For each folder it finds with that name, output the full path to a TXT file, so that in the end I wind up with a text file listing every result it found with its full path. If it finds folders named "NAME1" in five different subdirectories of the folder I give it, I want five full paths, each beginning with the drive letter and ending with "NAME1".
  3. Take the list from the TXT file and copy each folder to another drive, preserving the directory structure.

So basically, if it searches and finds this:

D:\TEST1\NAME1
D:\TEST7\NAME1
D:\TEST8\NAME1\

That's what I want to appear in the text file.

Then I want it to go through each line in the text file and plug each value into Copy-Item (I'm thinking the source directory would get assigned to a variable), so that when it's all said and done, on the second drive I wind up with this:

E:\BACKUP\TEST1\NAME1
E:\BACKUP\TEST7\NAME1
E:\BACKUP\TEST8\NAME1\

So in short, I'm looking for a Get-ChildItem that can define a series of paths, which Copy-Item can then use to back them up elsewhere.
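
To make this concrete, a literal version of steps 1-3 might look something like the sketch below (NAME1, the drive paths, and found.txt are just placeholders):

$searchRoot = 'D:\'                 # folder to search under (placeholder)
$target     = 'E:\BACKUP\'          # backup destination (placeholder)
$search     = 'NAME1'
$listFile   = 'E:\BACKUP\found.txt' # the TXT list (placeholder name)

# Steps 1 and 2: find folders named NAME1 and write their full paths to the TXT file
Get-ChildItem -Path $searchRoot -Directory -Recurse -Filter $search |
    Select-Object -ExpandProperty FullName |
    Set-Content -Path $listFile

# Step 3: read the list back and copy each folder, re-creating the structure above it
Get-Content -Path $listFile | ForEach-Object {
    $dest   = Join-Path -Path $target -ChildPath $_.Substring($searchRoot.Length)
    $parent = Split-Path -Parent $dest
    if (-not (Test-Path $parent)) { New-Item -ItemType Directory -Path $parent | Out-Null }
    Copy-Item -Path $_ -Destination $dest -Recurse -Force
}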

I already have one way to do this, but the problem is it copies everything every time. Since one of these drives is an SSD, I only want to copy what's new or changed on each run (not to mention that would save time whenever I need to run a backup):

$source = "C:\"
$target = "E:\BACKUP\"
$search = "NAME1"
$source_regex = [regex]::escape($source)
(gci $source -recurse | where {-not ($_.psiscontainer)} | select -expand fullname) -match "\\$search\\" |
foreach { 

$file_dest = ($_ | split-path -parent) -replace $source_regex,$target

if (-not (test-path $file_dest)){mkdir $file_dest}
copy-item $_ -Destination $file_dest -force -verbose
}

If there's a way to do this that wouldn't require writing out a TXT file each time, I'd be all for that, but I don't know another way to do what I'm looking for except with Copy-Item.

I'd be very grateful for any help I can get with this. Thanks all!

tnpir4002
  • You mention "Get-ChildItem to look for folders with a specific name". If so, your `gci` is very inefficient; you could just search for all folders named exactly `Name1`: `gci -Filter Name1 -Recurse -Directory`. Aside from that, what is the question? You're asking for help, but it's not clear with what – Santiago Squarzon Jan 14 '23 at 15:14
  • "Get-ChildItem -Path "C:\" -Filter "*.*" -Recurse | %{$_.FullName}" (see: https://stackoverflow.com/a/41836207/724039 ) – Luuk Jan 14 '23 at 15:18
  • Your code only copies the $search folder each time. You are using -Force, which will write even when the file or folder already exists. So each time a copy is performed, a check is also made to see if the folder exists and, if not, a new folder is created. If you did this a different way, you would still have to write the code to check whether folders exist. What you have is, I think, the best method. – jdweng Jan 14 '23 at 15:21
  • You want to copy only if the new file is newer than the file in the archive. Since you are using an SSD, you only want to copy when the file changes. So the best way is to get a list of files and then copy only the newer ones. See the following, which uses PS and XCopy: https://www.tutorialspoint.com/how-to-copy-only-updated-or-newer-files-with-powershell#:~:text=To%20copy%20only%20updated%20or%20newer%20files%20with%20PowerShell%2C%20we,and%20copy%20the%20latest%20file. – jdweng Jan 14 '23 at 15:32
  • I use a combination of Robocopy and XCopy for various things, but the problem is I've never been able to get them to do exactly what I'm looking for. I want them to find and copy only folders called NAME1, and then copy them to a destination drive while maintaining the directory structure. – tnpir4002 Jan 14 '23 at 15:41
  • Looks like I found some syntax that does the job: Get-ChildItem $source -Filter *$search* -Name -Recurse | ForEach-Object { robocopy "$source\$_" "$target\$_" /z /s } – tnpir4002 Jan 14 '23 at 16:02
  • Yes, that looks a lot simpler; you were overcomplicating it before. `Copy-Item` with `-Recurse` already preserves folder structure. The only problem is that you need a different folder name for each destination, otherwise each name1 folder could be overwritten by a new name1 folder – Santiago Squarzon Jan 14 '23 at 16:05
  • I got close with this: "Get-ChildItem -Path $source -Filter *$search* -Name -Recurse | Copy-Item -Destination $target -Recurse" But it's giving me an error that it can't find the destination file and it's giving the directory of the BAT file I'm running instead of the actual destination folder. – tnpir4002 Jan 14 '23 at 16:23
  • @SantiagoSquarzon The way I've got this set up, it specifically preserves the directory structure: whenever it finds a folder named NAME1, it re-creates the directory structure above it to keep everything separate. I'm doing this to back up files from a specific project so I can copy them to another system and restore them, in effect keeping the entire project portable. – tnpir4002 Jan 14 '23 at 16:43

1 Answer

If I understand correctly, you want to copy all folders with a certain name, keeping the original folder structure in the destination path, and copy only files that are newer than what is already in the destination.

Try

$source = 'C:\'
$target = 'E:\BACKUP\'
$search = 'NAME1'

# -ErrorAction SilentlyContinue because in the C:\ disk you are bound to get Access Denied on some paths
Get-ChildItem -Path $source -Directory -Recurse -Filter $search -ErrorAction SilentlyContinue | ForEach-Object {
    # construct the destination folder path
    $dest = Join-Path -Path $target -ChildPath $_.FullName.Substring($source.Length)
    # copy the folder including its files and subfolders (but not empty subfolders)
    # for more switches see https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
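    # /XO skips source files that are older than (or the same age as) the copy already in the destination, /S copies subfolders but skips empty ones, /R:0 means no retries on failed copies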
    robocopy $_.FullName $dest  /XO /S /R:0
}

If you don't want the console output of robocopy, you can silence it by redirecting all of its output streams to `$null` (append `*> $null` to the robocopy line), so neither stdout nor stderr is echoed.
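
For example, the robocopy call above would then become (assuming PowerShell 3.0 or later, where `*>` redirects all output streams):

robocopy $_.FullName $dest /XO /S /R:0 *> $null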

If you want to keep a file after this with both the source paths and the destinations, I'd suggest doing

$source = 'C:\'
$target = 'E:\BACKUP\'
$search = 'NAME1'
$output = [System.Collections.Generic.List[object]]::new()

# -ErrorAction SilentlyContinue because in the C:\ disk you are bound to get Access Denied on some paths
Get-ChildItem -Path $source -Directory -Recurse -Filter $search -ErrorAction SilentlyContinue | ForEach-Object {
    # construct the destination folder path
    $dest = Join-Path -Path $target -ChildPath $_.FullName.Substring($source.Length)
    # add an object to the output list
    $output.Add([PsCustomObject]@{Source = $_.FullName; Destination = $dest })
    # copy the folder including its files and subfolders (but not empty subfolders)
    # for more switches see https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
    robocopy $_.FullName $dest  /XO /S /R:0
}
# write the output to csv file
$output | Export-Csv -Path 'E:\backup.csv' -NoTypeInformation
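
If you also want the plain TXT list of found source paths that the question describes, you could additionally dump just the Source column to a text file (the path E:\backup.txt is only an example):

$output | Select-Object -ExpandProperty Source | Set-Content -Path 'E:\backup.txt'
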
Theo