
We have two custom indices calculated by S&P. S&P uploads 8 files (4 for each index) to an FTP server folder that I can access. They are uploaded every evening, and I retrieve them the following morning. I want this automated because only 5 days of files are kept on the server at a time.

Here are example file names:

20160805_LUKDGUP
20160805_LUKDGUP_NCS
20160805_LUKDGUP_NCS_ADJ
20160805_LUKDGUP_NCS_CLS
20160805_LUKSGUP
20160805_LUKSGUP_NCS
20160805_LUKSGUP_NCS_ADJ
20160805_LUKSGUP_NCS_CLS

What I am trying to do is download them to our local server and have them automatically sorted into folders based on file name: the 20160805_LUKDGUP_* files go into a folder LUKDGUP\20160805\, and the same for the LUKSGUP files.
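For what it's worth, the target folder for each file is derivable from the first two underscore-separated tokens of its name. A minimal sketch of that mapping, in Python for illustration:

```python
import os

def target_folder(filename):
    """Derive the IndexName\\Date folder from a name like 20160805_LUKDGUP_NCS_ADJ."""
    date, index = filename.split("_")[:2]   # first two underscore-separated tokens
    return os.path.join(index, date)

print(target_folder("20160805_LUKDGUP_NCS_ADJ"))   # LUKDGUP\20160805 on Windows
```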

I've tried modifying some batch files from other posts, but think I keep making small mistakes that prevent them from working.

Edit 20160809-0841: I have been trying to modify this to suit my needs, but haven't been able to follow the loop that well:

setlocal
set "basename=."
PAUSE
for /F "tokens=1 delims=_" %%a in ('dir /B /A-D') do (
   set "filename=%%a"
   setlocal EnableDelayedExpansion
   for /F "delims=" %%c in ("!basename!") do if "!filename:%%c=!" equ "!filename!" (
      set "basename=!filename!"
      md "!basename!"
   )
   move "!filename!.%%b" "!basename!"
   for /F "delims=" %%c in ("!basename!") do (
      endlocal
      set "basename=%%c"
      PAUSE
   )
)

Source. Of course this doesn't involve the FTP automation, but I figure I can get that part sorted out with FileZilla or some other FTP client. Organizing the files once they're on our server is what this piece was trying to solve.
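The FTP retrieval can also be scripted rather than done through an FTP client. A minimal sketch using Python's standard-library `ftplib` (the host, credentials, and remote path below are placeholders, not real values):

```python
from ftplib import FTP
import os

# Placeholder connection details -- substitute the real S&P FTP host and credentials.
HOST, USER, PASSWORD = "ftp.example.com", "user", "password"
REMOTE_DIR = "/remote/path"

def target_folder(filename):
    """Map 20160805_LUKDGUP_NCS_ADJ -> LUKDGUP\\20160805, per the layout described above."""
    date, index = filename.split("_")[:2]
    return os.path.join(index, date)

def download_all():
    with FTP(HOST) as ftp:
        ftp.login(USER, PASSWORD)
        ftp.cwd(REMOTE_DIR)
        for name in ftp.nlst():
            if "_" not in name:          # skip anything that doesn't match the naming pattern
                continue
            folder = target_folder(name)
            os.makedirs(folder, exist_ok=True)
            # open(..., "wb") truncates any existing copy, so re-downloading a file
            # that is still inside the 5-day server window simply overwrites it.
            with open(os.path.join(folder, name), "wb") as f:
                ftp.retrbinary("RETR " + name, f.write)

if __name__ == "__main__":
    download_all()
```

Scheduled each morning (e.g. via Windows Task Scheduler), this keeps up with the 5-day retention window.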

    If you could please [edit your post](http://stackoverflow.com/posts/38838906/edit) to include the code you have already tried that isn't working, that would be very helpful. – SomethingDark Aug 08 '16 at 22:57

1 Answer


Do not try to hack this in a batch file.

It's way easier to implement this in PowerShell:

$url = "ftp://user:password@ftp.example.com/remote/path/"
$request = [System.Net.WebRequest]::Create($url)
$request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$response = $request.GetResponse()
$reader = New-Object IO.StreamReader $response.GetResponseStream() 
$listing = $reader.ReadToEnd()
$reader.Close()
$response.Close()

$files =
    ($listing -split "`r`n") |
    Where-Object {$_ -like "*_*"}

$webclient = New-Object System.Net.WebClient 

foreach ($file in $files)
{
    $tokens = $file -split "_"
    # Build the local path IndexName\Date (e.g. LUKDGUP\20160805), matching the layout the question asks for
    $localPath = (Join-Path $tokens[1] $tokens[0])
    $localFilePath = (Join-Path $localPath $file)
    Write-Host ($file + " => " + $localFilePath)

    if (!(Test-Path -Path $localPath))
    {
        New-Item -Path $localPath -ItemType directory | Out-Null
    }

    $webclient.DownloadFile(($url + $file), $localFilePath)
}
Martin Prikryl
  • Thank you! This is really close! It grabs three of each set's files but doesn't grab the base file 20160805_LUKDGUP nor 20160805_LUKSGUP. How does this handle overwriting? Files are up there 5 days at a time, for example M1 T1 W1 Th1 F1, then T1 W1 Th1 F1 M2, etc. I don't foresee it being a big deal if the previous files are overwritten though. – Matthew Sawyer Aug 09 '16 at 17:22
  • I've fixed the code for `20160805_LUKSGUP` + The files are overwritten. – Martin Prikryl Aug 09 '16 at 17:58
  • It does indeed grab them now, but they are put in their own folder LUKDGUP.SDL and LUKSGUP.SDL. I really appreciate your help with this! – Matthew Sawyer Aug 09 '16 at 18:27
  • What's `.SDL`? It's not mentioned in your question. And anyway, the SO is not a code writing service. I've shown you the concept, the rest is on you. – Martin Prikryl Aug 09 '16 at 20:44