
I have a PowerShell script which I'm using to FTP a 7.3 GB file, in chunks, from the US to Europe. The script works fine until, occasionally, the connection drops and the transfer stops.

I have tried multiple versions with try/catch and a retry counter, so that if the connection fails the script retries sending the same chunk.

This usually works, but there is a problem: if the connection drops after half of a chunk has already reached the FTP location, the retry sends the entire chunk again, so by the time the transfer finishes I can end up with a file that is larger than the original.

This larger file is also corrupted/invalid, since it simply contains more bytes than it is supposed to.

The piece of code I'm using to FTP the files is:

# FTP connection details
$ftp_addr = "ftp://ftp.example.com/Backups/"
$user = "abc"
$pass = "1234"

$bufSize = 256MB

# some more irrelevant code here where I identify files to be FTP'ed etc.
# ......

# Initialize connection to FTP
$ftp = [System.Net.FtpWebRequest]::Create($destination_filename + ".zip")
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = New-Object System.Net.NetworkCredential($user, $pass)

$ftp.Timeout = -1              #infinite timeout
$ftp.ReadWriteTimeout = -1     #infinite timeout

$ftp.UseBinary = $true
$ftp.UsePassive = $true

$requestStream = $ftp.GetRequestStream()
$fileStream = [System.IO.File]::OpenRead($backup_target_app_data)
$chunk = New-Object byte[] $bufSize

try {

    while ($bytesRead = $fileStream.Read($chunk, 0, $bufSize))
    {
        $retryCount = 0

        while ($retryCount -lt 30) {
            try {
                $requestStream.Write($chunk, 0, $bytesRead)
                $requestStream.Flush()
                break   # chunk sent, move on to the next one
            }
            catch {
                $retryCount += 1   # write failed, retry the same chunk
            }
        }
    }
}
finally {
    $fileStream.Close()
    $requestStream.Close()
}

Now, I'm not sure how to handle this. I've been thinking about shrinking my chunk size from 256 MB down to the size of a TCP packet, but as far as I know that can vary as well (up to 64 KB).

So I'm looking for a way to handle this connection drop, since I'm not sure how to approach it at the moment. Any help is really appreciated.


1 Answer


With FtpWebRequest, the only way to resume a transfer after the connection is interrupted is to reconnect and start writing at the end of the partially uploaded file.

For that, use FtpWebRequest.ContentOffset; a sketch of the approach follows below.

A related question with full code (although in C# and for download):
How to download FTP files with automatic resume in case of disconnect
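
The same approach can be sketched in PowerShell. The following is a rough outline under assumptions, not tested code: it queries the server with the GetFileSize method to learn how many bytes already arrived, then reconnects with the AppendFile method and uploads the remainder (appending is an alternative to setting ContentOffset, which in .NET applies to downloads). $url is a hypothetical placeholder; $user, $pass and $backup_target_app_data come from the question.

# Sketch: resume an interrupted upload by appending the remaining bytes.
# Assumes the server supports the SIZE and APPE commands.
$url = "ftp://ftp.example.com/Backups/file.zip"   # hypothetical target path
$credentials = New-Object System.Net.NetworkCredential($user, $pass)

# 1. Ask the server how many bytes of the file already arrived
$sizeRequest = [System.Net.FtpWebRequest]::Create($url)
$sizeRequest.Credentials = $credentials
$sizeRequest.Method = [System.Net.WebRequestMethods+Ftp]::GetFileSize
$response = $sizeRequest.GetResponse()
$remoteSize = $response.ContentLength
$response.Close()

# 2. Reconnect and append the rest of the local file
$appendRequest = [System.Net.FtpWebRequest]::Create($url)
$appendRequest.Credentials = $credentials
$appendRequest.Method = [System.Net.WebRequestMethods+Ftp]::AppendFile
$appendRequest.UseBinary = $true

$fileStream = [System.IO.File]::OpenRead($backup_target_app_data)
$fileStream.Seek($remoteSize, [System.IO.SeekOrigin]::Begin) | Out-Null

$requestStream = $appendRequest.GetRequestStream()
$buffer = New-Object byte[] 1MB
while ($bytesRead = $fileStream.Read($buffer, 0, $buffer.Length))
{
    $requestStream.Write($buffer, 0, $bytesRead)
}
$requestStream.Close()
$fileStream.Close()

On another disconnect, you would repeat the whole sequence: query the new remote size, seek past it, and append again.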


Or use an FTP library that can resume the transfer automatically.

For example, the WinSCP .NET assembly can. With it, a resumable upload is as trivial as:

# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = "user"
    Password = "mypassword"
}

$session = New-Object WinSCP.Session

try
{
    # Connect
    $session.Open($sessionOptions)

    # Resumable upload
    $session.PutFileToDirectory("C:\path\file.zip", "/home/user")
}
finally
{
    # Disconnect and clean up
    $session.Dispose()
}

(I'm the author of WinSCP)

Martin Prikryl