90

Is there any way to copy a really large file (from one server to another) in PowerShell AND display its progress?

There are solutions out there that use Write-Progress in conjunction with looping to copy many files and display progress. However, I can't seem to find anything that would show the progress of a single file.

Any thoughts?

wonea
Jason Jarrett

12 Answers

131

It seems like a much better solution to just use BitsTransfer, which comes out of the box on most Windows machines with PowerShell 2.0 or greater.

Import-Module BitsTransfer
Start-BitsTransfer -Source $Source -Destination $Destination -Description "Backup" -DisplayName "Backup"
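Note that Start-BitsTransfer transfers files rather than recursing a whole directory tree, so for a folder a rough sketch (variable names are placeholders, not from the original answer) is to enumerate the files first and queue one transfer per file:

Import-Module BitsTransfer
$files = Get-ChildItem -Path $SourceDir -Recurse -File
foreach ($file in $files) {
    $relative = $file.FullName.Substring($SourceDir.Length).TrimStart('\')
    $target   = Join-Path $DestinationDir $relative
    # Make sure the target folder exists before queuing the transfer
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    Start-BitsTransfer -Source $file.FullName -Destination $target -Description $relative -DisplayName "Backup"
}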
Nacht
  • Great! Indeed, this also gives me a (powershell) progress indicator. – mousio Nov 24 '14 at 10:33
  • it probably won't leverage BITS capabilities if you are not pulling the source from a remote location, but it works smoothly. – mCasamento Mar 07 '15 at 16:04
  • 4
    Exactly what I was after- worked perfectly and gives a progress bar! – Shawson Jan 04 '16 at 12:44
  • I used this to copy VMs around in 2012 R2 Hyper-V free. – johnny Aug 26 '16 at 22:01
  • I'm using this on a Hyper-V Server with minimal GUI and it works great! – MemphiZ Dec 05 '16 at 15:25
  • 4
    This should be the top answer. – tyteen4a03 Sep 06 '17 at 14:32
  • Does not work with Powershell Core 6.1.0 at least. I get an import error saying "BitsTransfer.psd1 does not support current PowerShell edition 'Core'" – Manuzor Dec 06 '18 at 09:39
  • 2
    Strangely, when I run the command, nothing at all happens with no output? I use for example -Source "\\comp1\c$\folder" -Destionation "\\comp2\c$\folder" , any idea what could be wrong? I have access to both folders, no issue there. If I use copy-item it works but with no progress obviously. – Rakha Mar 29 '19 at 14:04
  • In Ps 5.1 I get: `user is not registered in Network.` I copy from Network share. I use the 2 Parameters `source` and `Destination`. – Timo Aug 07 '19 at 10:18
  • @Rakha Think `Start-BitsTransfer` only works on files not directories. Unfortunately you can't even specify wildcards for files. – Rituraj Borpujari Jul 17 '23 at 08:14
54

I haven't heard about progress with Copy-Item. If you don't want to use any external tool, you can experiment with streams. The buffer size can vary; you may try different values (from 2 KB to 64 KB).

function Copy-File {
    param( [string]$from, [string]$to)
    $ffile = [io.file]::OpenRead($from)
    $tofile = [io.file]::OpenWrite($to)
    Write-Progress -Activity "Copying file" -status "$from -> $to" -PercentComplete 0
    try {
        [byte[]]$buff = new-object byte[] 4096
        [long]$total = [int]$count = 0
        do {
            $count = $ffile.Read($buff, 0, $buff.Length)
            $tofile.Write($buff, 0, $count)
            $total += $count
            if ($total % 1mb -eq 0) {
                Write-Progress -Activity "Copying file" -status "$from -> $to" `
                   -PercentComplete ([long]($total * 100 / $ffile.Length))
            }
        } while ($count -gt 0)
    }
    finally {
        $ffile.Dispose()
        $tofile.Dispose()
        Write-Progress -Activity "Copying file" -Status "Ready" -Completed
    }
}
VeeTheSecond
stej
  • 6
    Interesting solution. When I tried it I received an error - Cannot convert value "2147483648" to type "System.Int32". Error: "Value was either too large or too small for an Int32." After replacing the [int] to a [long], it worked great. Thanks – Jason Jarrett Mar 23 '10 at 13:44
  • That means that you copy files bigger than 2GB? I guess so. I'm glad it works :) – stej Mar 23 '10 at 14:23
  • +1 simple solutions are best! Am copying big (8GB+) files across from one network location to another ... gigabit network ... (*indication only*) ... using blocks of 1Mb means network adapter runs at about 50% (I suspect some throttling on our switch) ... smaller blocks weren't great though. – Aidanapword Mar 11 '13 at 11:09
  • 1
    Small .NETy gripe: the finally should call Dispose() rather than Close(). Good solution though. I'm sad there's no built-in progress available. – TheXenocide May 07 '13 at 15:59
  • Right, missing using statement in PowerShell :) – stej May 09 '13 at 13:09
  • @TheXenocide Why? Stream.Close() just calls Stream.Dispose()? http://referencesource.microsoft.com/#mscorlib/system/io/stream.cs – Keith Hill Sep 25 '14 at 00:00
  • That's right, but basically whenever the object is IDisposable, I call Dispose(). Imo it's safer and also you don't need to know the internals. – stej Sep 25 '14 at 16:33
  • 5
    Well, this is more of a programming quip than a scripting one (if you choose to differentiate), but from a computer science point of view: you are relying on internal implementation details of an object which are not guaranteed and can change at any time, and additionally not following an established pattern for the public contract. This both violates a primary tenet of object-oriented design and also ignores the public IDisposable contract (which you *are* supposed to know exists) that has well established best practices that state it should always be disposed. – TheXenocide Sep 25 '14 at 18:37
  • You should replace the `[io.file]::OpenWrite($to)` to `[io.file]::Create($to)` otherwise it will not overwrite files properly. – Scott Chamberlain Oct 16 '15 at 16:29
  • What does it mean "properly"? – stej Oct 22 '15 at 05:33
  • If anyone is trying to get this to work with a PSDrive you'll want to change $ffile initialization to (Get-ChildItem $from).OpenRead() and the $tofile initialization to (new-item $to -Type File -Force).OpenWrite() – Nick May 18 '17 at 13:52
  • I changed the [int]$total into [long]$total to take large files into account, like Jason suggested. – VeeTheSecond Jul 31 '20 at 01:49
  • Need to handle exception on 0 length files. Currently gets a divide by zero error when it come across files like that. – Jroonk Jul 08 '23 at 09:32
33

Alternatively, this option uses the native Windows progress bar...

$FOF_CREATEPROGRESSDLG = "&H0&"

$objShell = New-Object -ComObject "Shell.Application"

$objFolder = $objShell.NameSpace($DestLocation) 

$objFolder.CopyHere($srcFile, $FOF_CREATEPROGRESSDLG)
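The "&H0&" literal above is VB-style hex for 0, i.e. the default behaviour. The second argument is a bit mask of the standard shell file-operation flags, so as a rough sketch (the flag values are the documented SHFILEOPSTRUCT constants, not something from this answer) you can combine 16 ("Yes to All", which suppresses overwrite prompts) with 512 (no confirmation when a new directory has to be created):

$objShell  = New-Object -ComObject "Shell.Application"
$objFolder = $objShell.NameSpace($DestLocation)
# 16 = respond "Yes to All" to any prompt, 512 = don't ask before creating a directory
$objFolder.CopyHere($srcFile, 16 + 512)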
Tisho
Chris M
  • 1
    This is brilliant, how do you specify the "ALWAYS OVERWRITE" flag for this method, is it possible? So it doesn't prompt when files exist. – Rakha Jun 06 '19 at 18:39
  • @Rakha you just need to pass 16 as sec param to CopyHere function like this: `$objFolder.CopyHere($srcFile, 16)` – ‌‌R‌‌‌. Jul 17 '21 at 04:37
33
cmd /c copy /z src dest

Not pure PowerShell, but it can be run from PowerShell, and it displays progress as a percentage.
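For example (paths are placeholders), quote the paths if they contain spaces:

cmd /c copy /z "\\server1\share\big backup.vhdx" "D:\backups\big backup.vhdx"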

wonea
Jirka Jr.
  • Great answer. I also used [this answer](https://stackoverflow.com/a/31015007/4163002) to output progress. – ZX9 Jan 07 '20 at 20:04
17

I amended the code from stej (which was great, just what I needed!) to use a larger buffer, [long] for larger files, and the System.Diagnostics.Stopwatch class to track elapsed time and estimate time remaining.

I also added reporting of the transfer rate during the copy, and it outputs the overall elapsed time and overall transfer rate at the end.

It uses a 4 MB (4096*1024 bytes) buffer, which gave better-than-native Win7 throughput when copying from a NAS to a USB stick on a laptop over Wi-Fi.

On To-Do list:

  • add error handling (catch)
  • handle get-childitem file list as input
  • nested progress bars when copying multiple files (file x of y, % of total data copied, etc.)
  • input parameter for buffer size

Feel free to use/improve :-)

function Copy-File {
param( [string]$from, [string]$to)
$ffile = [io.file]::OpenRead($from)
$tofile = [io.file]::OpenWrite($to)
Write-Progress `
    -Activity "Copying file" `
    -status ($from.Split("\")|select -last 1) `
    -PercentComplete 0
try {
    $sw = [System.Diagnostics.Stopwatch]::StartNew();
    [byte[]]$buff = new-object byte[] (4096*1024)
    [long]$total = [long]$count = 0
    do {
        $count = $ffile.Read($buff, 0, $buff.Length)
        $tofile.Write($buff, 0, $count)
        $total += $count
        [int]$pctcomp = ([int]($total/$ffile.Length* 100));
        [int]$secselapsed = [int]($sw.elapsedmilliseconds.ToString())/1000;
        if ( $secselapsed -ne 0 ) {
            [single]$xferrate = (($total/$secselapsed)/1mb);
        } else {
            [single]$xferrate = 0.0
        }
        if ($total % 1mb -eq 0) {
            if($pctcomp -gt 0)`
                {[int]$secsleft = ((($secselapsed/$pctcomp)* 100)-$secselapsed);
                } else {
                [int]$secsleft = 0};
            Write-Progress `
                -Activity ($pctcomp.ToString() + "% Copying file @ " + "{0:n2}" -f $xferrate + " MB/s")`
                -status ($from.Split("\")|select -last 1) `
                -PercentComplete $pctcomp `
                -SecondsRemaining $secsleft;
        }
    } while ($count -gt 0)
$sw.Stop();
$sw.Reset();
}
finally {
    write-host (($from.Split("\")|select -last 1) + `
     " copied in " + $secselapsed + " seconds at " + `
     "{0:n2}" -f [int](($ffile.length/$secselapsed)/1mb) + " MB/s.");
     $ffile.Close();
     $tofile.Close();
    }
}
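A hypothetical call (paths are illustrative only):

Copy-File -from "\\nas\media\video-archive.mkv" -to "E:\video-archive.mkv"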
Chris J
Graham Gold
  • Nice script, but it gives a divide by zero. I had to add: if ( $secselapsed -ne 0 ) { [single]$xferrate = (($total/$secselapsed)/1mb); } else { [single]$xferrate = 0.0 } – 79E09796 Jun 20 '13 at 10:11
  • Not something I've come across in my daily use of this code, what powershell version are you using? Does it ever work for you? Just curious. Anything that makes it more robust is fine by me :-) – Graham Gold Jun 21 '13 at 23:12
  • On Powershell 2.0.1.1 it did work intermittently, but most times not. It seemed it might be copying the first block too fast and then rounding down the $secelapsed. I've put in the update, might save someone some time. Thanks again, it's a useful script. – 79E09796 Jun 25 '13 at 08:39
  • I owe @stej for the original code that I adapted, but thanks :-) – Graham Gold Jun 25 '13 at 10:39
  • Nice script, but the divide by zero error is in the line: "{0:n2}" -f [int](($ffile.length/$secselapsed)/1mb) + " MB/s."); You check for $secselapsed -eq 0 up above in the script, but don't at that point. – Sako73 Mar 03 '15 at 02:05
  • You should replace the `[io.file]::OpenWrite($to)` to `[io.file]::Create($to)` otherwise it will not overwrite files properly. – Scott Chamberlain Oct 16 '15 at 16:29
  • @ScottChamberlain never had a single overwrite fail with the `::OpenWrite` method yet but will bear that in mind if I do – Graham Gold Oct 18 '15 at 15:12
  • @GrahamGold Copy a file, then copy a 2nd file with the same name but a smaller filesize. The resultant file will have the end of the first file tacked on after the end of the 2nd file. – Scott Chamberlain Oct 18 '15 at 15:17
  • @ScottChamberlain explains why I wouldn't hit it in my usage of the above code - that scenario isn't possible in the script where I use it, if overwrite option is chosen at runtime then each file existing file is renamed as a temp file name before new file copied then temp file removed on completion of copy. – Graham Gold Oct 18 '15 at 15:22
  • How do I get Start-BitsTransfer (really, Write-Progress) to log updates as text to standard output instead of just the PowerShell progress bar? – John Zabroski Feb 12 '18 at 20:04
9

Not that I'm aware of. I wouldn't recommend using Copy-Item for this anyway. I don't think it was designed to be as robust as robocopy.exe, which supports retries, something you would want for extremely large file copies over the network.
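For reference, a robocopy call for a single large file might look like this (just a sketch; /Z is restartable mode, /R and /W set the retry count and wait time, and robocopy shows per-file percentage progress on its own):

robocopy \\server1\d$\backups D:\backups huge-database.bak /Z /R:5 /W:10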

Keith Hill
  • 1
    Valid point. In this particular case I'm not too worried about robustness. It's copying a 15gig file between two servers on the same back-plane. However in other situations I would definitely consider a more robust solution. – Jason Jarrett Mar 23 '10 at 13:28
4

Hate to be the one to bump an old subject, but I found this post extremely useful. After running performance tests on the snippets by stej and its refinement by Graham Gold, plus the BITS suggestion by Nacht, I have decided that:

  1. I really liked Graham's command with time estimations and speed readings.
  2. I also really liked the significant speed increase of using BITS as my transfer method.

Faced with the decision between the two... I found that Start-BitsTransfer supports an -Asynchronous mode. So here is the result of merging the two.

function Copy-File {
    # ref: https://stackoverflow.com/a/55527732/3626361
    param([string]$From, [string]$To)

    try {
        $job = Start-BitsTransfer -Source $From -Destination $To `
            -Description "Moving: $From => $To" `
            -DisplayName "Backup" -Asynchronous

        # Start stopwatch
        $sw = [System.Diagnostics.Stopwatch]::StartNew()
        Write-Progress -Activity "Connecting..."

        while ($job.JobState.ToString() -ne "Transferred") {
            switch ($job.JobState.ToString()) {
                "Connecting" {
                    break
                }
                "Transferring" {
                    $pctcomp = ($job.BytesTransferred / $job.BytesTotal) * 100
                    $elapsed = ($sw.elapsedmilliseconds.ToString()) / 1000

                    if ($elapsed -eq 0) {
                        $xferrate = 0.0
                    }
                    else {
                        $xferrate = (($job.BytesTransferred / $elapsed) / 1mb);
                    }

                    if ($job.BytesTransferred % 1mb -eq 0) {
                        if ($pctcomp -gt 0) {
                            $secsleft = ((($elapsed / $pctcomp) * 100) - $elapsed)
                        }
                        else {
                            $secsleft = 0
                        }

                        Write-Progress -Activity ("Copying file '" + ($From.Split("\") | Select-Object -last 1) + "' @ " + "{0:n2}" -f $xferrate + "MB/s") `
                            -PercentComplete $pctcomp `
                            -SecondsRemaining $secsleft
                    }
                    break
                }
                "Transferred" {
                    break
                }
                Default {
                    throw $job.JobState.ToString() + " unexpected BITS state."
                }
            }
        }

        $sw.Stop()
        $sw.Reset()
    }
    finally {
        Complete-BitsTransfer -BitsJob $job
        Write-Progress -Activity "Completed" -Completed
    }
}
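One thing to watch when adapting this: the while loop polls the job state as fast as it can. A short sleep inside the loop (my addition, not part of the original function) keeps CPU usage down without noticeably delaying the progress updates:

while ($job.JobState.ToString() -ne "Transferred") {
    Start-Sleep -Milliseconds 200
    # ... same switch ($job.JobState.ToString()) block as above ...
}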
SQB
Shane
4

I found none of the examples above met my needs. I wanted to copy a directory with subdirectories, but my source directory had too many files, so I quickly hit the BITS file limit (I had > 1500 files); the total directory size was also quite large.

I found a function using robocopy at https://keithga.wordpress.com/2014/06/23/copy-itemwithprogress/ that was a good starting point; however, it wasn't quite robust enough: it didn't handle trailing slashes or spaces gracefully, and it did not stop the copy when the script was halted.

Here is my refined version:

function Copy-ItemWithProgress
{
    <#
    .SYNOPSIS
    RoboCopy with PowerShell progress.

    .DESCRIPTION
    Performs file copy with RoboCopy. Output from RoboCopy is captured,
    parsed, and returned as Powershell native status and progress.

    .PARAMETER Source
    Directory to copy files from; this should not contain trailing slashes

    .PARAMETER Destination
    Directory to copy files to; this should not contain trailing slashes

    .PARAMETER FilesToCopy
    A wildcard expression of which files to copy; defaults to *.*

    .PARAMETER RobocopyArgs
    List of arguments passed directly to Robocopy.
    Must not conflict with defaults: /ndl /TEE /Bytes /NC /nfl /Log

    .PARAMETER ProgressID
    When specified (>=0) will use this identifier for the progress bar

    .PARAMETER ParentProgressID
    When specified (>= 0) will use this identifier as the parent ID for progress bars
    so that they appear nested which allows for usage in more complex scripts.

    .OUTPUTS
    Returns an object with the status of final copy.
    REMINDER: Any error level below 8 can be considered a success by RoboCopy.

    .EXAMPLE
    C:\PS> .\Copy-ItemWithProgress c:\Src d:\Dest

    Copy the contents of the c:\Src directory to a directory d:\Dest
    Without the /e or /mir switch, only files from the root of c:\src are copied.

    .EXAMPLE
    C:\PS> .\Copy-ItemWithProgress '"c:\Src Files"' d:\Dest /mir /xf *.log -Verbose

    Copy the contents of the 'c:\Src Files' directory to a directory d:\Dest
    /mir and /XF parameters are passed to robocopy, and script is run verbose

    .LINK
    https://keithga.wordpress.com/2014/06/23/copy-itemwithprogress

    .NOTES
    By Keith S. Garner (KeithGa@KeithGa.com) - 6/23/2014
    With inspiration by Trevor Sullivan @pcgeek86
    Tweaked by Justin Marshall - 02/20/2020

    #>

    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$true)]
        [string]$Source,
        [Parameter(Mandatory=$true)]
        [string]$Destination,
        [Parameter(Mandatory=$false)]
        [string]$FilesToCopy="*.*",
        [Parameter(Mandatory = $true,ValueFromRemainingArguments=$true)] 
        [string[]] $RobocopyArgs,
        [int]$ParentProgressID=-1,
        [int]$ProgressID=-1
    )

    #handle spaces and trailing slashes
    $SourceDir = '"{0}"' -f ($Source -replace "\\+$","")
    $TargetDir = '"{0}"' -f ($Destination -replace "\\+$","")


    $ScanLog  = [IO.Path]::GetTempFileName()
    $RoboLog  = [IO.Path]::GetTempFileName()
    $ScanArgs = @($SourceDir,$TargetDir,$FilesToCopy) + $RobocopyArgs + "/ndl /TEE /bytes /Log:$ScanLog /nfl /L".Split(" ")
    $RoboArgs = @($SourceDir,$TargetDir,$FilesToCopy) + $RobocopyArgs + "/ndl /TEE /bytes /Log:$RoboLog /NC".Split(" ")

    # Launch Robocopy Processes
    write-verbose ("Robocopy Scan:`n" + ($ScanArgs -join " "))
    write-verbose ("Robocopy Full:`n" + ($RoboArgs -join " "))
    $ScanRun = start-process robocopy -PassThru -WindowStyle Hidden -ArgumentList $ScanArgs
    try
    {
        $RoboRun = start-process robocopy -PassThru -WindowStyle Hidden -ArgumentList $RoboArgs
        try
        {
            # Parse Robocopy "Scan" pass
            $ScanRun.WaitForExit()
            $LogData = get-content $ScanLog
            if ($ScanRun.ExitCode -ge 8)
            {
                $LogData|out-string|Write-Error
                throw "Robocopy $($ScanRun.ExitCode)"
            }
            $FileSize = [regex]::Match($LogData[-4],".+:\s+(\d+)\s+(\d+)").Groups[2].Value
            write-verbose ("Robocopy Bytes: $FileSize `n" +($LogData -join "`n"))
            #determine progress parameters
            $ProgressParms=@{}
            if ($ParentProgressID -ge 0) {
                $ProgressParms['ParentID']=$ParentProgressID
            }
            if ($ProgressID -ge 0) {
                $ProgressParms['ID']=$ProgressID
            } else {
                $ProgressParms['ID']=$RoboRun.Id
            }
            # Monitor Full RoboCopy
            while (!$RoboRun.HasExited)
            {
                $LogData = get-content $RoboLog
                $Files = $LogData -match "^\s*(\d+)\s+(\S+)"
                if ($null -ne $Files )
                {
                    $copied = ($Files[0..($Files.Length-2)] | ForEach-Object {$_.Split("`t")[-2]} | Measure-Object -sum).Sum
                    if ($LogData[-1] -match "(100|\d?\d\.\d)\%")
                    {
                        write-progress Copy -ParentID $ProgressParms['ID'] -percentComplete $LogData[-1].Trim("% `t") $LogData[-1]
                        $Copied += $Files[-1].Split("`t")[-2] /100 * ($LogData[-1].Trim("% `t"))
                    }
                    else
                    {
                        write-progress Copy -ParentID $ProgressParms['ID'] -Complete
                    }
                    write-progress ROBOCOPY  -PercentComplete ($Copied/$FileSize*100) $Files[-1].Split("`t")[-1] @ProgressParms
                }
            }
        } finally {
            if (!$RoboRun.HasExited) {Write-Warning "Terminating copy process with ID $($RoboRun.Id)..."; $RoboRun.Kill() ; }
            $RoboRun.WaitForExit()
            # Parse full RoboCopy pass results, and cleanup
            (get-content $RoboLog)[-11..-2] | out-string | Write-Verbose
            remove-item $RoboLog
            write-output ([PSCustomObject]@{ ExitCode = $RoboRun.ExitCode })

        }
    } finally {
        if (!$ScanRun.HasExited) {Write-Warning "Terminating scan process with ID $($ScanRun.Id)..."; $ScanRun.Kill() }
        $ScanRun.WaitForExit()

        remove-item $ScanLog
    }
}
Justin
3

This recursive function copies files and directories from the source path to the destination path. If a file already exists at the destination, it is copied only when the source file is newer.

Function Copy-FilesBitsTransfer(
        [Parameter(Mandatory=$true)][String]$sourcePath, 
        [Parameter(Mandatory=$true)][String]$destinationPath, 
        [Parameter(Mandatory=$false)][bool]$createRootDirectory = $true)
{
    $item = Get-Item $sourcePath
    $itemName = Split-Path $sourcePath -leaf
    if (!$item.PSIsContainer){ #Item Is a file

        $clientFileTime = Get-Item $sourcePath | select LastWriteTime -ExpandProperty LastWriteTime

        if (!(Test-Path -Path $destinationPath\$itemName)){
            Start-BitsTransfer -Source $sourcePath -Destination $destinationPath -Description "$sourcePath >> $destinationPath" -DisplayName "Copy Template file" -Confirm:$false
            if (!$?){
                return $false
            }
        }
        else{
            $serverFileTime = Get-Item $destinationPath\$itemName | select LastWriteTime -ExpandProperty LastWriteTime

            if ($serverFileTime -lt $clientFileTime)
            {
                Start-BitsTransfer -Source $sourcePath -Destination $destinationPath -Description "$sourcePath >> $destinationPath" -DisplayName "Copy Template file" -Confirm:$false
                if (!$?){
                    return $false
                }
            }
        }
    }
    else{ #Item Is a directory
        if ($createRootDirectory){
            $destinationPath = "$destinationPath\$itemName"
            if (!(Test-Path -Path $destinationPath -PathType Container)){
                if (Test-Path -Path $destinationPath -PathType Leaf){ #In case item is a file, delete it.
                    Remove-Item -Path $destinationPath
                }

                New-Item -ItemType Directory $destinationPath | Out-Null
                if (!$?){
                    return $false
                }

            }
        }
        Foreach ($fileOrDirectory in (Get-Item -Path "$sourcePath\*"))
        {
            $status = Copy-FilesBitsTransfer $fileOrDirectory $destinationPath $true
            if (!$status){
                return $false
            }
        }
    }

    return $true
}
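Hypothetical usage (the function returns $true on success and $false as soon as any transfer fails):

Copy-FilesBitsTransfer -sourcePath 'C:\Templates\Project' -destinationPath 'D:\Deploy'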
codemonkee
Dudi72
2

Sean Kearney, from the Hey, Scripting Guy! blog, has a solution that I found works pretty nicely.

Function Copy-WithProgress
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true,
            ValueFromPipelineByPropertyName=$true,
            Position=0)]
        $Source,
        [Parameter(Mandatory=$true,
            ValueFromPipelineByPropertyName=$true,
            Position=1)]
        $Destination
    )

    $Source=$Source.tolower()
    $Filelist=Get-Childitem "$Source" -Recurse
    $Total=$Filelist.count
    $Position=0

    foreach ($File in $Filelist)
    {
        $Filename=$File.Fullname.tolower().replace($Source,'')
        $DestinationFile=($Destination+$Filename)
        Write-Progress -Activity "Copying data from '$source' to '$Destination'" -Status "Copying File $Filename" -PercentComplete (($Position/$total)*100)
        Copy-Item $File.FullName -Destination $DestinationFile
        $Position++
    }
}

Then to use it:

Copy-WithProgress -Source $src -Destination $dest
E-rich
  • 2
    This will report the number of files copied in `$Filelist`, whereas the question is asking how to report the progress of copying a single file (i.e. the count of bytes/blocks copied thus far). If this code were used to copy a single, large file it would give no indication how far the copy operation has progressed within that file. From the question body: "There are solutions out there to use Write-Progress in conjunction with looping to copy many files and display progress. However I can't seem to find anything that would show progress of a single file." – Lance U. Matthews May 16 '18 at 01:40
1

This is an old post, but I thought it might help others.
The solution with FileStreams is elegant and it works, but it is very slow.
I think using other programs, like robocopy.exe, defeats PowerShell's purpose.
That was even one of the motivations in the Monad Manifesto.
So I cracked open the Copy-Item cmdlet from Microsoft.PowerShell.Management, and ultimately it calls CopyFileEx from kernel32.dll.

In the CopyFileEx signature, there's a parameter that accepts a callback to provide progress information.
On pinvoke.net there's a great example of how to marshal this function and the callback into a delegate.
I modified it slightly so we can provide the delegate from the PS script itself.

And believe me when I say this:
I was not expecting this to work :D (I literally jumped out of my chair).

And it's considerably faster.
Here's the code:

function Copy-File {

    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0)]
        [string]$Path,

        [Parameter(Mandatory, Position = 1)]
        [string]$Destination
    )

    $signature = @'
    namespace Utilities {

        using System;
        using System.Runtime.InteropServices;
    
        public class FileSystem {
            
            [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
            [return: MarshalAs(UnmanagedType.Bool)]
            static extern bool CopyFileEx(
                string lpExistingFileName,
                string lpNewFileName,
                CopyProgressRoutine lpProgressRoutine,
                IntPtr lpData,
                ref Int32 pbCancel,
                CopyFileFlags dwCopyFlags
            );
        
            delegate CopyProgressResult CopyProgressRoutine(
            long TotalFileSize,
            long TotalBytesTransferred,
            long StreamSize,
            long StreamBytesTransferred,
            uint dwStreamNumber,
            CopyProgressCallbackReason dwCallbackReason,
            IntPtr hSourceFile,
            IntPtr hDestinationFile,
            IntPtr lpData);
        
            int pbCancel;
        
            public enum CopyProgressResult : uint
            {
                PROGRESS_CONTINUE = 0,
                PROGRESS_CANCEL = 1,
                PROGRESS_STOP = 2,
                PROGRESS_QUIET = 3
            }
        
            public enum CopyProgressCallbackReason : uint
            {
                CALLBACK_CHUNK_FINISHED = 0x00000000,
                CALLBACK_STREAM_SWITCH = 0x00000001
            }
        
            [Flags]
            enum CopyFileFlags : uint
            {
                COPY_FILE_FAIL_IF_EXISTS = 0x00000001,
                COPY_FILE_RESTARTABLE = 0x00000002,
                COPY_FILE_OPEN_SOURCE_FOR_WRITE = 0x00000004,
                COPY_FILE_ALLOW_DECRYPTED_DESTINATION = 0x00000008
            }
        
            public void CopyWithProgress(string oldFile, string newFile, Func<long, long, long, long, uint, CopyProgressCallbackReason, System.IntPtr, System.IntPtr, System.IntPtr, CopyProgressResult> callback)
            {
                CopyFileEx(oldFile, newFile, new CopyProgressRoutine(callback), IntPtr.Zero, ref pbCancel, CopyFileFlags.COPY_FILE_RESTARTABLE);
            }
        }
    }
'@

    Add-Type -TypeDefinition $signature

    [Func[long, long, long, long, System.UInt32, Utilities.FileSystem+CopyProgressCallbackReason, System.IntPtr, System.IntPtr, System.IntPtr, Utilities.FileSystem+CopyProgressResult]]$copyProgressDelegate = {

        param($total, $transfered, $streamSize, $streamByteTrans, $dwStreamNumber, $reason, $hSourceFile, $hDestinationFile, $lpData)

        Write-Progress -Activity "Copying file" -Status "$Path ~> $Destination. $([Math]::Round(($transfered/1KB), 2))KB/$([Math]::Round(($total/1KB), 2))KB." -PercentComplete (($transfered / $total) * 100)
    }

    $fileName = [System.IO.Path]::GetFileName($Path)
    $destFileName = [System.IO.Path]::GetFileName($Destination)
    if ([string]::IsNullOrEmpty($destFileName) -or $destFileName -notlike '*.*') {
        # Destination looks like a directory; append the source file name.
        if ($Destination.EndsWith('\')) {
            $destFullName = "$Destination$fileName"
        }
        else {
            $destFullName = "$Destination\$fileName"
        }
    }
    else {
        # Destination already names a file; use it as-is.
        $destFullName = $Destination
    }

    $wrapper = New-Object Utilities.FileSystem
    $wrapper.CopyWithProgress($Path, $destFullName, $copyProgressDelegate)
}
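A hypothetical call (paths are placeholders); the Write-Progress callback reports the KB transferred while CopyFileEx performs the copy:

Copy-File -Path 'D:\ISO\WinServer2022.iso' -Destination 'E:\Staging\'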

Hope it helps.
Happy scripting!

Update:

Same thing, but using CopyFile2.

try {
    Add-Type -TypeDefinition @'
namespace Utilities {
    using System;
    using System.Text;
    using System.Runtime.InteropServices;

    public delegate COPYFILE2_MESSAGE_ACTION CopyFile2ProgressRoutine(
        [In] COPYFILE2_MESSAGE pMessage,
        [In, Optional] IntPtr pvCallbackContext
    );

    [Flags]
    public enum CopyFlags : uint {
        COPY_FILE_FAIL_IF_EXISTS = 0x00000001,
        COPY_FILE_RESTARTABLE = 0x00000002,
        COPY_FILE_OPEN_SOURCE_FOR_WRITE = 0x00000004,
        COPY_FILE_ALLOW_DECRYPTED_DESTINATION = 0x00000008,
        COPY_FILE_COPY_SYMLINK = 0x00000800,
        COPY_FILE_NO_BUFFERING = 0x00001000,
        COPY_FILE_REQUEST_SECURITY_PRIVILEGES = 0x00002000,
        COPY_FILE_RESUME_FROM_PAUSE = 0x00004000,
        COPY_FILE_NO_OFFLOAD = 0x00040000,
        COPY_FILE_REQUEST_COMPRESSED_TRAFFIC = 0x10000000
    }
    
    public enum COPYFILE2_MESSAGE_ACTION : uint {
        COPYFILE2_PROGRESS_CONTINUE,
        COPYFILE2_PROGRESS_CANCEL,
        COPYFILE2_PROGRESS_STOP,
        COPYFILE2_PROGRESS_QUIET,
        COPYFILE2_PROGRESS_PAUSE
    }

    public enum COPYFILE2_MESSAGE_TYPE : uint {
        COPYFILE2_CALLBACK_NONE,
        COPYFILE2_CALLBACK_CHUNK_STARTED,
        COPYFILE2_CALLBACK_CHUNK_FINISHED,
        COPYFILE2_CALLBACK_STREAM_STARTED,
        COPYFILE2_CALLBACK_STREAM_FINISHED,
        COPYFILE2_CALLBACK_POLL_CONTINUE,
        COPYFILE2_CALLBACK_ERROR,
        COPYFILE2_CALLBACK_MAX
    }

    public enum COPYFILE2_COPY_PHASE : uint {
        COPYFILE2_PHASE_NONE,
        COPYFILE2_PHASE_PREPARE_SOURCE,
        COPYFILE2_PHASE_PREPARE_DEST,
        COPYFILE2_PHASE_READ_SOURCE,
        COPYFILE2_PHASE_WRITE_DESTINATION,
        COPYFILE2_PHASE_SERVER_COPY,
        COPYFILE2_PHASE_NAMEGRAFT_COPY,
        COPYFILE2_PHASE_MAX
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct ULARGE_INTEGER {
        public Int64 QuadPart;
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct _ChunkStarted {
        public uint          dwStreamNumber;
        public uint          dwReserved;
        public IntPtr         hSourceFile;
        public IntPtr         hDestinationFile;
        public ULARGE_INTEGER uliChunkNumber;
        public ULARGE_INTEGER uliChunkSize;
        public ULARGE_INTEGER uliStreamSize;
        public ULARGE_INTEGER uliTotalFileSize;
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct _ChunkFinished {
        public uint          dwStreamNumber;
        public uint          dwFlags;
        public IntPtr         hSourceFile;
        public IntPtr         hDestinationFile;
        public ULARGE_INTEGER uliChunkNumber;
        public ULARGE_INTEGER uliChunkSize;
        public ULARGE_INTEGER uliStreamSize;
        public ULARGE_INTEGER uliStreamBytesTransferred;
        public ULARGE_INTEGER uliTotalFileSize;
        public ULARGE_INTEGER uliTotalBytesTransferred;
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct _StreamStarted {
        public uint          dwStreamNumber;
        public uint          dwReserved;
        public IntPtr         hSourceFile;
        public IntPtr         hDestinationFile;
        public ULARGE_INTEGER uliStreamSize;
        public ULARGE_INTEGER uliTotalFileSize;
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct _StreamFinished {
        public uint          dwStreamNumber;
        public uint          dwReserved;
        public IntPtr         hSourceFile;
        public IntPtr         hDestinationFile;
        public ULARGE_INTEGER uliStreamSize;
        public ULARGE_INTEGER uliStreamBytesTransferred;
        public ULARGE_INTEGER uliTotalFileSize;
        public ULARGE_INTEGER uliTotalBytesTransferred;
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct _PollContinue {
        public uint dwReserved;
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct _Error {
        COPYFILE2_COPY_PHASE CopyPhase;
        uint                dwStreamNumber;
        IntPtr              hrFailure;
        uint                dwReserved;
        ULARGE_INTEGER       uliChunkNumber;
        ULARGE_INTEGER       uliStreamSize;
        ULARGE_INTEGER       uliStreamBytesTransferred;
        ULARGE_INTEGER       uliTotalFileSize;
        ULARGE_INTEGER       uliTotalBytesTransferred;
    }

    [StructLayout(LayoutKind.Explicit)]
    public struct COPYFILE2_MESSAGE {
        [FieldOffset(0)]
        public COPYFILE2_MESSAGE_TYPE Type;

        [FieldOffset(1)]
        public uint dwPadding;

        [FieldOffset(2)]
        public _ChunkStarted ChunkStarted;

        [FieldOffset(2)]
        public _ChunkFinished ChunkFinished;

        [FieldOffset(2)]
        public _StreamStarted StreamStarted;

        [FieldOffset(2)]
        public _StreamFinished StreamFinished;

        [FieldOffset(2)]
        public _PollContinue PollContinue;

        [FieldOffset(2)]
        public _Error Error;
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct COPYFILE2_EXTENDED_PARAMETERS {
        public uint dwSize;
        public CopyFlags dwCopyFlags;
        public bool pfCancel;
        public CopyFile2ProgressRoutine pProgressRoutine;
        public IntPtr pvCallbackContext;
    }

    public class FileSystem {

        [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
        public static extern uint CopyFile2(
            string pwszExistingFileName,
            string pwszNewFileName,
            COPYFILE2_EXTENDED_PARAMETERS pExtendedParameters
        );

        public static void CopyFileEx(string filePath, string destination, Func<COPYFILE2_MESSAGE, IntPtr, COPYFILE2_MESSAGE_ACTION> callback) {
            COPYFILE2_EXTENDED_PARAMETERS extParams = new();
            extParams.dwSize = Convert.ToUInt32(Marshal.SizeOf(extParams));
            extParams.dwCopyFlags = CopyFlags.COPY_FILE_NO_BUFFERING | CopyFlags.COPY_FILE_COPY_SYMLINK;
            extParams.pProgressRoutine = new CopyFile2ProgressRoutine(callback);
            extParams.pvCallbackContext = IntPtr.Zero;

            uint result = CopyFile2(filePath, destination, extParams);
            if (result != 0)
                throw new SystemException(result.ToString());
        }
    }
}
'@
}
catch { }

[Func[
    Utilities.COPYFILE2_MESSAGE,
    IntPtr,
    Utilities.COPYFILE2_MESSAGE_ACTION
]]$delegate = {

    param([Utilities.COPYFILE2_MESSAGE]$message, $extArgs, $result)

    if ($message.Type -eq [Utilities.COPYFILE2_MESSAGE_TYPE]::COPYFILE2_CALLBACK_CHUNK_FINISHED) {
        Write-Progress -Activity 'Copying file.' -Status 'Copying...' -PercentComplete (($message.ChunkFinished.uliTotalFileSize.QuadPart / $message.ChunkFinished.uliStreamBytesTransferred.QuadPart) * 100)
    }
}

if (!(Test-Path -Path C:\CopyFile2TestDestination -PathType Container)) { [void](mkdir C:\CopyFile2TestDestination) }
[Utilities.FileSystem]::CopyFileEx('C:\superTest.dat', 'C:\CopyFile2TestDestination\superTestCopy.dat', $delegate)
FranciscoNabas
  • Looks great, but i just learned about CopyFileEx not supporting the source being a directory. Is CopyFile2 any better in this regard? – gth May 09 '23 at 00:19
  • 1
    Indeed, my original idea is to manage directories directly, using either 'New-Item' or 'System.IO.Directory'. CopyFile2 does not support a progress callback, but it's a good source. – FranciscoNabas May 09 '23 at 00:57
  • Your code is the best example I've found for CopyFileEx - I do appreciate it! Looks like [CopyFile2](https://learn.microsoft.com/en-us/windows/win32/api/winbase/nf-winbase-copyfile2) supports a [progress](https://learn.microsoft.com/en-us/windows/win32/api/winbase/ns-winbase-copyfile2_extended_parameters) [routine](https://learn.microsoft.com/en-us/windows/win32/api/winbase/nc-winbase-pcopyfile2_progress_routine). I might try and call CopyFileEx recursively whenever I detect a directory as a source item, and keep drilling down. Hopefully won't hit any limitations. – gth May 10 '23 at 22:09
  • Interesting! I'll have to try it. Thanks! – FranciscoNabas May 11 '23 at 18:15
  • 1
    Yes!!! It werkss. https://github.com/FranciscoNabas/PowerShellPublic/blob/main/CopyFile2.ps1 For some weird reason ChunkFinished.uliTotalBytesTransferred and ChunkFinished.uliTotalFileSize are backwards. I must have missed something while marshaling the union. – FranciscoNabas May 11 '23 at 22:17
0

Trevor Sullivan has a write-up on how to add a command called Copy-ItemWithProgress to PowerShell, built on Robocopy.

wonea
Wouter