
I have a script partially based on the one here: Upload files with FTP using PowerShell

It all works absolutely fine with tiny files, but I am trying to use it to make the process we use for exporting Access .mdb files to clients that only have FTP more robust.

My first test involved a 10MB file, and I ran into a System.OutOfMemoryException at the Get-Content stage.

The PowerShell ISE was running at nearly 2 GB of memory usage during the get attempt.

Here is a full script sample (be gentle, I am fairly new to this):

#####
# User variables to control the script
#####

# How many times connection will be re-tried
$connectionTries = 5
#time between tries in seconds
$connectionTryInterval = 300
#Where to log the output
$logFile = "D:\MyPath\ftplog.txt"
#maximum log file size in KB before it is archived
$logFileMaxSize = 500

#formatted date part for the specific file to transfer
#This is appended to the filename base. Leave as "" for none
$datePart = ""
#base part of the file name
$fileNameBase = "Myfile"
#file extension
$fileExtension = ".mdb"
#location of the source file (please include trailing backslash)
$sourceLocation = "D:\MyPath\"

#location and credentials of the target ftp server    
$userName = "iamafish"
$password = "ihavenofingers"
$ftpServer = "10.0.1.100"

######
# Main Script
#####

#If there is a log file and it is longer than the declared limit then archive it with the current timestamp
if (test-path $logfile)
{
    if( $((get-item $logFile).Length/1kb) -gt $logFileMaxSize)
    {
        write-host $("archiving log to ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
        rename-item $logFile $("ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
    }
}

#start new log entry
#Add-Content $logFile "___________________________________________________________"
#write-host $logEntry

#construct source file and destination uri
$fileName = $fileNameBase + $datePart + $fileExtension
$sourceFile = $sourceLocation + $fileName
$sourceuri = "ftp://" + $ftpServer + "/" + $fileName


# Create a FTPWebRequest object to handle the connection to the ftp server
$ftprequest = [System.Net.FtpWebRequest]::create($sourceuri)

# set the request's network credentials for an authenticated connection
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)

$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftprequest.UseBinary = $true
$ftprequest.KeepAlive = $false

$succeeded = $true
$errorMessage = ""

# read in the file to upload as a byte array
trap [exception]{
    $script:succeeded = $false
    $script:errorMessage = $_.Exception.Message
    Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|" + $_.Exception.Message)
    #write-host $logEntry
    #write-host $("TRAPPED: " + $_.Exception.GetType().FullName)
    #write-host $("TRAPPED: " + $_.Exception.Message)
    exit
}
#The -ea 1 forces the error to be trappable
$content = gc -en byte $sourceFile -ea 1


$try = 0

do{
    trap [System.Net.WebException]{
        $script:succeeded = $false
        $script:errorMessage = $_.Exception.Message
        Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|" + $_.Exception.Message)
        #write-host $logEntry
        #write-host $("TRAPPED: " + $_.Exception.GetType().FullName)
        $script:try++
        start-sleep -s $connectionTryInterval
        continue
    }
    $ftpresponse = $ftprequest.GetResponse()

} while(($try -le $connectionTries) -and (-not $succeeded))

if ($succeeded) { 

    Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|0|" + "Starting file transfer.")
    # get the request stream, and write the bytes into it
    $rs = $ftprequest.GetRequestStream()
    $rs.Write($content, 0, $content.Length)
    # be sure to clean up after ourselves
    $rs.Close()
    $rs.Dispose()
    # $content is a plain byte array, so there is nothing to Close or Dispose here
    Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|0|" + "Transfer complete.")
    #write-host $logEntry
}

I can't put code in comments so, thanks to pointers from Keith, I have moved the file access bit down to the bottom to link it with the other, like so:

trap [Exception]{
    $script:succeeded = $false
    $script:errorMessage = $_.Exception.Message
    Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|Check File Connection|" + $_.Exception.Message)
    $sourceStream.Close()
    $sourceStream.Dispose()
    #write-host $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|Attempt to open file|" + $_.Exception.Message)
    #write-host $("TRAPPED: " + $_.Exception.GetType().FullName)
    exit
}
$sourceStream = New-Object IO.FileStream ($(New-Object System.IO.FileInfo $sourceFile),[IO.FileMode]::Open)
[byte[]]$readbuffer = New-Object byte[] 1024

# get the request stream, and write the bytes into it
$rs = $ftprequest.GetRequestStream()
do{
    $readlength = $sourceStream.Read($readbuffer,0,1024)
    $rs.Write($readbuffer,0,$readlength)
} while ($readlength -ne 0)

I just need to work out why I get: Exception calling "GetResponse" with "0" argument(s): "Cannot access a disposed object." every other time I run it. Is this a quirk of running it in the ISE or am I doing something drastically wrong with either the initial declaration or the final disposing?

I'll post the full final script when done, since I think it will make a nice sturdy FTP export example with error trapping and logging.


OK, here is the full script. Dispose is edited out, but with or without it, running the script within 5 minutes will either get me a message that I cannot use a disposed object or tell me that GetResponse() has produced an error (226) File transferred (running in the ISE). Whilst this will not be a problem during normal operation, I would like to correctly log out of the FTP session and clean up the resources at the end of the script, and ensure I am correctly declaring them as needed.

#####
# User variables to control the script
#####

# How many times connection will be re-tried
$connectionTries = 5
#time between tries in seconds
$connectionTryInterval = 1
#Where to log the output
$logFile = "D:\MyPath\ftplog.txt"
#maximum log file size in KB before it is archived
$logFileMaxSize = 500
#log to file or console - $true = log to file, $false = log to console
$logToFile=$false

#formatted date part for the specific file to transfer
#This is appended to the filename base. Leave as "" for none
$datePart = ""
#base part of the file name
$fileNameBase = "MyFile"
#file extension
$fileExtension = ".mdb"
#location of the source file (please include trailing backslash)
$sourceLocation = "D:\MyPath\"

#location and credentials of the target ftp server
$userName = "iamafish"
$password = "ihavenofingers"
$ftpServer = "10.0.1.100"

######
# Main Script
#####

function logEntry($entryType, $section, $message)
{
    #just to make a one point switch for logging to console for testing
    # $entryType: 0 = success, 1 = Error
    # $section: The section of the script the log entry was generated from
    # $message: the log message

    #This is pipe separated to fit in with my standard MSSQL linked flat file schema for easy querying
    $logString = "$(get-Date -format "yyyy-MM-dd hh:mm:ss")|$entryType|$section|$message"

    if($script:logtoFile)
    {
        Add-Content $logFile $logString
    }
    else
    {
        write-host $logString
    }
}

#If there is a log file and it is longer than the declared limit then archive it with the current timestamp
if (test-path $logfile)
{
    if( $((get-item $logFile).Length/1kb) -gt $logFileMaxSize)
    {
        write-host $("archiving log to ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
        rename-item $logFile $("ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
        New-Item $logFile -type file
    }
}
else
{
    New-Item $logFile -type file
}


#construct source file and destination uri
$fileName = $fileNameBase + $datePart + $fileExtension
$sourceFile = $sourceLocation + $fileName
$destination = "ftp://" + $ftpServer + "/" + $fileName


#Check if the source file exists
if ((test-path $sourceFile) -eq $false)
{
    logEntry 1 "Check Source File" $("File not found: " + $sourceFile)
    Exit
}


# Create a FTPWebRequest object to handle the connection to the ftp server
$ftpRequest = [System.Net.FtpWebRequest]::create($destination)

# set the request's network credentials for an authenticated connection
$ftpRequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$ftpRequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftpRequest.UseBinary = $true
$ftpRequest.KeepAlive = $false

$succeeded = $true
$try = 1

do{
    trap [Exception]{
        $script:succeeded = $false
        logEntry 1 "Check FTP Connection" $_.Exception.Message
        $script:try++
        start-sleep -s $connectionTryInterval
        continue
    }
    $ftpResponse = $ftpRequest.GetResponse()

} while(($try -le $connectionTries) -and (-not $succeeded))

if ($succeeded) {
    logEntry 0 "Connection to FTP" "Success"


    # Open a filestream to the source file
    trap [Exception]{
        logEntry 1 "Check File Connection" $_.Exception.Message
        $sourceStream.Close()
        $ftpResponse.Close()
        exit
    }
    $sourceStream = New-Object IO.FileStream ($(New-Object System.IO.FileInfo $sourceFile),[IO.FileMode]::Open)
    [byte[]]$readbuffer = New-Object byte[] 1024

    logEntry 0 "Starting file transfer" "Success"
    # get the request stream, and write the bytes into it
    $rs = $ftpRequest.GetRequestStream()
    do{
        $readlength = $sourceStream.Read($readbuffer,0,1024)
        $rs.Write($readbuffer,0,$readlength)
    } while ($readlength -ne 0)

    logEntry 0 "Transfer complete" "Success"
    # be sure to clean up after ourselves
    $rs.Close()
    #$rs.Dispose()
    $sourceStream.Close()
    #$sourceStream.Dispose()

}
$ftpResponse.Close()

Example of trying to trap the Transfer OK response at the end:

logEntry 0 "Starting file transfer" "Success"
# get the request stream, and write the bytes into it
$rs = $ftpRequest.GetRequestStream()
do{
    $readlength = $sourceStream.Read($readbuffer,0,1024)
    $rs.Write($readbuffer,0,$readlength)
} while ($readlength -ne 0)
$rs.Close()
#start-sleep -s 2

trap [Exception]{
    $script:succeeded = $false
    logEntry 1 "Check FTP Connection" $_.Exception.Message
    continue
}
$ftpResponse = $ftpRequest.GetResponse()
Bodestone
2 Answers


Having hit a similar issue myself, with RAM usage hitting the gigabytes while uploading a 3MB file, I found that replacing:

 $content = gc -en byte $sourceFile

With:

 $content = [System.IO.File]::ReadAllBytes($sourceFile)

Gives much better performance. As mentioned elsewhere, chunking would be a better solution for really large files, as then you're not holding the whole file in memory at once, but the code above at least only consumes roughly (size of file) bytes of RAM, which means it should be good up to the tens-of-megabytes range.
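For context, a minimal end-to-end upload using this approach might look like the sketch below. The server address, credentials, and file path are the placeholders from the question, not real values:

```powershell
# Sketch: upload a file with FtpWebRequest using ReadAllBytes (placeholder values)
$sourceFile = "D:\MyPath\Myfile.mdb"
$ftpRequest = [System.Net.FtpWebRequest]::Create("ftp://10.0.1.100/Myfile.mdb")
$ftpRequest.Credentials = New-Object System.Net.NetworkCredential("iamafish","ihavenofingers")
$ftpRequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftpRequest.UseBinary = $true

# ReadAllBytes allocates a single byte[] the size of the file - no pipeline overhead
$content = [System.IO.File]::ReadAllBytes($sourceFile)

$rs = $ftpRequest.GetRequestStream()
$rs.Write($content, 0, $content.Length)
$rs.Close()    # the request stream must be closed before the request is sent

$ftpResponse = $ftpRequest.GetResponse()
$ftpResponse.Close()
```

The key difference from `gc -en byte` is that Get-Content emits the file through the pipeline one object at a time, which is where the memory blow-up comes from; ReadAllBytes is a single allocation.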

rmc47

Rather than reading the whole file into memory using Get-Content, try reading it a chunk at a time and writing it to the FTP request stream. I would use one of the lower-level .NET file stream APIs to do the reading. Admittedly, you wouldn't think a 10MB file would pose a memory problem, though.

Also, make sure you get the response after getting the request stream and writing to it. Getting the response is what actually uploads the data. From the docs:

When using an FtpWebRequest object to upload a file to a server, you must write the file content to the request stream obtained by calling the GetRequestStream method or its asynchronous counterparts, the BeginGetRequestStream and EndGetRequestStream methods. You must write to the stream and close the stream before sending the request.

Requests are sent to the server by calling the GetResponse method or its asynchronous counterparts, the BeginGetResponse and EndGetResponse methods. When the requested operation completes, an FtpWebResponse object is returned. The FtpWebResponse object provides the status of the operation and any data downloaded from the server.
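Putting that ordering together, here is a sketch of the chunked upload (using the question's 1024-byte buffer; credentials and paths are placeholders, and error handling is omitted for brevity):

```powershell
# Sketch: chunked FTP upload with the write/close/GetResponse ordering from the docs
$ftpRequest = [System.Net.FtpWebRequest]::Create("ftp://10.0.1.100/Myfile.mdb")
$ftpRequest.Credentials = New-Object System.Net.NetworkCredential("iamafish","ihavenofingers")
$ftpRequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftpRequest.UseBinary = $true

$sourceStream = [System.IO.File]::OpenRead("D:\MyPath\Myfile.mdb")
$rs = $ftpRequest.GetRequestStream()
$readbuffer = New-Object byte[] 1024
do {
    # read up to 1024 bytes; Read returns 0 at end of file
    $readlength = $sourceStream.Read($readbuffer, 0, $readbuffer.Length)
    $rs.Write($readbuffer, 0, $readlength)
} while ($readlength -gt 0)
$sourceStream.Close()
$rs.Close()                              # 1. close the request stream first

$ftpResponse = $ftpRequest.GetResponse() # 2. only now is the request actually sent
$ftpResponse.Close()                     # 3. close the response to end the session
```

Only one 1 KB buffer is ever held in memory, so this scales to arbitrarily large files.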

Keith Hill
  • Reading it in as a byte array seems to be the memory killer. – mjolinor Feb 17 '11 at 16:35
  • @keith-hill Thanks. I did a lot of looking around and modified the script as above. Now, though, when I run it the first time it works, but when I run it a second time I get Exception calling "GetResponse" with "0" argument(s): "Cannot access a disposed object." Each alternate run then errors. Maybe I am not understanding how Dispose actually works. My .NET is somewhat lacking. – Bodestone Feb 17 '11 at 20:28
  • I'm not quite following your code, especially with the updated part and how it correlates to the rest of the code. In the code at the top it appears you get the response (which submits the data) before you write the data to the request stream. – Keith Hill Feb 17 '11 at 22:05
  • The problems seem to go away if I leave things 5 minutes before re-running. Just fighting with some weird string concatenation issue just now before putting out a fully updated one. $logstring = $(get-Date -format "yyyy-MM-dd hh:mm:ss") + "|" + ($type) + "|" + ($section) + "|" + ($message) - variables are 1,2,3 respectively and the string is output as 2011-02-17 10:15:49|1 2 3|| – Bodestone Feb 17 '11 at 22:22
  • OK, I was being daft with my string concatenation. That was fine; I was just calling the function wrong. Too used to other languages and using myFunction(1,2,3) instead of myFunction 1 2 3. Done just a bit more tidying on the script and defaulted it to log to the console. – Bodestone Feb 18 '11 at 00:31
  • I've just read your addendum. I thought: hmm, OK, I have to use GetResponse() after $rs.Close() and trap it so I can check if it is 226. I used identical code to the trap for the first GetResponse(), removing the looping for multiple connection tries. It trapped nothing, but if I executed the script again straight away then the first one in the script trapped it as "The remote server returned an error: 226 Transfer OK" – Bodestone Feb 18 '11 at 01:00
  • Sorry if this is wandering off topic slightly, but I do think a record of the issues that may be involved would be useful for others. It certainly would have been for me. I actually now have several ideas of different things to search on to resolve some of the final issues myself, but it is past my bedtime. – Bodestone Feb 18 '11 at 01:12
  • @keith-hill You have definitely answered the question originally asked. I still keep running into other issues along the way. The error now is usually "Unable to write data to the transport connection: An established connection was aborted by the software in your host machine." This is only for large files to remote servers. Large files (only >5MB) to local servers work fine, so I am not sure if it is timeout or memory, but looking it up it seems to be .NET-specific and something MS will not discuss in public, rather requiring you to send them network traffic logs. This may be something for another post. – Bodestone Feb 20 '11 at 01:51