
We have a Windows 2012 R2 AWS EC2 instance with a particular folder structure created for a COTS application. We have built fault tolerance: any time this instance goes down and another one comes up, the new instance installs everything from scratch. The challenge is copying the folder structure onto the new instance. The structure is quite deep (five levels) and I would like to avoid manually creating these hundreds of folders when the new instance comes up.

To illustrate, my current EC2 instance has:

C:\ABC
C:\ABC\sub1
C:\ABC\sub2
...
C:\ABC\subn

C:\ABC\sub1\child1-sub1
C:\ABC\sub1\child2-sub1
...
C:\ABC\sub2\child1-sub2
C:\ABC\sub2\child2-sub2
...

so on..

My idea is: if I can copy the folder structure (without files) into a variable, I can write the variable to a file and copy that file to S3. When the new instance comes up, it reads the file from S3, gets the structure, and re-creates it.
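A minimal sketch of that idea (shown in Python for portability rather than PowerShell; the demo paths are made up, not the real `C:\ABC` layout): walk the tree, record each subdirectory's relative path, and write the list to a file that could then be uploaded to S3, e.g. with the AWS CLI's `aws s3 cp`.

```python
import os
import tempfile

def capture_tree(root):
    """Return the relative path of every subdirectory under root (no files)."""
    dirs = []
    for dirpath, dirnames, _ in os.walk(root):
        for d in dirnames:
            dirs.append(os.path.relpath(os.path.join(dirpath, d), root))
    return sorted(dirs)

# Build a small demo tree, then capture its structure to a manifest file.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "ABC", "sub1", "child1-sub1"))
os.makedirs(os.path.join(root, "ABC", "sub2"))

manifest = capture_tree(root)
with open(os.path.join(root, "structure.txt"), "w") as f:
    f.write("\n".join(manifest))

print(manifest)
```

The manifest file is plain text, one relative path per line, so it survives the trip through S3 without any encoding surprises.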

I tried `robocopy $source $dest /e /xf *.*`, but $dest is a directory. I need to store the results in some kind of variable that can be persisted somewhere.

Any suggestions/thoughts?

John Kens
Suvro Choudhury
  • Running "dir /S /AD /B" in a folder will give you a complete list of subfolders. Perhaps you could redirect that to a file then use that list as the basis for re-creating the folder structure. Note that, if command extensions are enabled, mkdir creates any intermediate folders needed so you really only need to do this for the leaf folders. – jarmod Aug 01 '18 at 01:16
  • Can't you just use [robocopy | Microsoft Docs](https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy) across a network share? – John Rotenstein Aug 01 '18 at 03:49
  • Well, the network share doesn't exist when this instance is running. The new instance only comes up when the existing one dies. – Suvro Choudhury Aug 03 '18 at 00:07

1 Answer


You can use the `tree` command to get the directory structure.

In PowerShell, `tree` prints a tree structure starting from the current directory down to its deepest descendants. You can also pass a custom path as an argument.

Then, as you said, you can store that output in S3.

Finally, you can run commands at startup that read the file from S3 and re-create the folder hierarchy on the new EC2 instance.
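The startup step could look like the following sketch (in Python for portability; it assumes the manifest was already downloaded from S3, e.g. via `aws s3 cp` in the instance user data, and that its format is one relative path per line — both are assumptions, not details from the question). Note that `os.makedirs`, like `mkdir` with command extensions enabled, creates intermediate directories, so listing only the leaf folders is enough.

```python
import os
import tempfile

def restore_tree(manifest_path, root):
    """Re-create every folder listed in the manifest under root."""
    with open(manifest_path) as f:
        for line in f:
            rel = line.strip()
            if rel:
                # exist_ok=True makes the startup script safe to re-run.
                os.makedirs(os.path.join(root, rel), exist_ok=True)

# Demo with an inline manifest standing in for the real S3 download.
root = tempfile.mkdtemp()
manifest = os.path.join(root, "structure.txt")
with open(manifest, "w") as f:
    f.write("ABC/sub1/child1-sub1\nABC/sub2\n")

restore_tree(manifest, root)
print(os.path.isdir(os.path.join(root, "ABC", "sub1", "child1-sub1")))
```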

Arafat Nalkhande
  • I could not run the tree command successfully to capture the folder structure. I followed https://stackoverflow.com/questions/27447014/use-powershell-to-generate-a-list-of-files-and-directories and ran Tree $Path /F | Select-Object -Skip 2 | Set-Content C:\temp\output.tkt, but there are lots of junk characters in the output file. I'm not sure how to use this and export it back to the new instance. – Suvro Choudhury Aug 03 '18 at 00:02
  • The closest I could get is running a robocopy command to create a dummy directory and save the log to a file, then deleting the dummy directory. The log lists all the folders and sub-folders, which can be used later to re-create them. It's a crude way, but that's what I have at the moment: robocopy $source $dest /e /xf *.* /log:C:\software\logs\robo.log and rm -Force $dest – Suvro Choudhury Aug 03 '18 at 00:06