
Background: Working through different techniques to retrieve remote directory information over FTP within PowerShell.

The script below seems to be a popular way of retrieving remote directory details using the 'off-the-shelf' FTP ListDirectoryDetails available in PowerShell (link1, link2, link3), and it works well.

$server = "ftp://servername"
$ftp = [System.Net.FtpWebRequest] [System.Net.WebRequest]::Create($server)
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
$response = $ftp.GetResponse()
$stream = $response.GetResponseStream()

$buffer = New-Object System.Byte[] 1024
$encoding = New-Object System.Text.AsciiEncoding

$outputBuffer = ""
$foundMore = $false

## Read all the data available from the stream, writing it to the
## output buffer when done.
do
{
    ## Allow data to buffer for a bit
    Start-Sleep -m 1000

    ## Read what data is available
    $foundMore = $false
    $stream.ReadTimeout = 1000

    do
    {
        try
        {
            $read = $stream.Read($buffer, 0, 1024)

            if ($read -gt 0)
            {
                $foundMore = $true
                $outputBuffer += $encoding.GetString($buffer, 0, $read)
            }
        } catch { $foundMore = $false; $read = 0 }
    } while ($read -gt 0)
} while ($foundMore)

$outputBuffer

Each iteration of the outer do loop begins with a delay. When multiple folders are being retrieved, that delay is incurred once per folder, so the shorter the delay, the faster the download. I dropped it from 1000 ms to 100 ms, a ten-fold improvement, and it worked fine. But not being a guru in these things, I don't really understand why the delay is needed, nor the consequences of making it too short.

Can someone explain the mechanism we're dealing with here?

Alan

1 Answer


You do not need any delay. It does not make any sense.

Actually, quite on the contrary. With a delay, the server may close the connection, as it may take you too long to read a response.


And that code is insanely complicated for what it does.

This will do the same:

$url = "ftp://username:password@ftp.example.com/remote/path/"
$request = [Net.WebRequest]::Create($url)
$request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$outputBuffer = $reader.ReadToEnd()

If you need to read the response line by line, the code can still be a lot simpler than in your question. I have already posted such code in my answer to your previous question Can PowerShell use FTP to retrieve remote folder and subfolder directory data in a single transmission?
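For illustration only (this is not the exact code from the linked answer, just a minimal sketch of the same idea; the URL and credentials are placeholders), a line-by-line variant could use `StreamReader.ReadLine` instead of `ReadToEnd`:

```powershell
$url = "ftp://username:password@ftp.example.com/remote/path/"
$request = [Net.WebRequest]::Create($url)
$request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())

while (!$reader.EndOfStream)
{
    # Each line is one entry of the directory listing;
    # parse or filter it here as needed.
    $line = $reader.ReadLine()
    Write-Host $line
}

$reader.Close()
$response.Close()
```

The reader blocks until data arrives, so no artificial delay is needed; the loop simply ends when the server finishes sending the listing.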

Martin Prikryl
  • Wow, I did not recognize what was going on in the previous post and should have studied it closer; such is the path we sometimes take; appreciate you pointing that out. As I commented on one of our previous exchanges, I am working through this step by step to make sure I fully understand what's happening. I do fully expect to end up at WinSCP, but I will get there step by step. Will also look more closely at above and the other post and redo my script testing for ListDirectoryDetails. Thanks for confirming the delay doesn't make sense in the code mentioned. – Alan Dec 01 '17 at 21:50
  • Following up. I was finally able to code a complete script using your samples above and at the other link you noted. You were a great help - thx. Now that I see ListDirectoryDetails fully working, I'd like to ask if `WinSCPnet.dll` is a PowerShell script or some other compiled module. Does it retrieve directory data significantly faster than ListDirectoryDetails? – Alan Dec 07 '17 at 00:17
  • 1
    @Alan `WinSCPnet.dll` is a C# assembly that runs the `winscp.exe` scripting interface. See https://winscp.net/eng/docs/library#purpose - No, I do not think WinSCP would be significantly faster than `FtpWebRequest`, as both do the same under the hood. – Martin Prikryl Dec 07 '17 at 07:43