
Our users upload files to our S3 bucket.

We need to download these files onto local computers.

We have an operations portal, and we would like to add a button that essentially opens a terminal and starts SFTPing these files into local storage, so that our operators can verify that the uploads were successful and maintained integrity.

Is this possible? We are running a PHP backend and JavaScript on the frontend.

I thought perhaps shell_exec (called from PHP) could be helpful, but am I wrong in saying that, because our PHP runs on the server, the command would never touch our local storage at all? Or am I massively misunderstanding this? Either way, I feel like that approach could cause some serious security issues.

Do I need a desktop-based application for this?

As it stands, our workaround is to let our operators copy and paste the specific command to download the files from S3. It just seems a bit manual, and I feel there must be a better way!
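(For reference, the command they paste is something along the lines of `aws s3 sync s3://<our-bucket> <local-folder>`; the bucket name and local path here are placeholders, not our real values.)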

D'Arcy Rail-Ip
  • PHP running on the server does not have any access to the user's computer. – JimL Mar 22 '16 at 19:29
  • Okay, just as I thought - any ideas on how to achieve what I'm looking for? – D'Arcy Rail-Ip Mar 22 '16 at 19:30
  • cmorrissey, those CLI AWS commands will be helpful for sure, but is there any way to trigger them without pasting them into the terminal? – D'Arcy Rail-Ip Mar 22 '16 at 19:32
  • Check out the official AWS SDK for PHP https://aws.amazon.com/sdk-for-php/ ... If you want to set up a button that downloads certain files, you could easily have a web front-end that is backed by a PHP server running the AWS SDK that I linked in this comment. – wilkesybear Mar 22 '16 at 19:33
  • what OS are your local machines running that need to sync up? – cmorrissey Mar 22 '16 at 19:34
  • We are running a Mac OS. – D'Arcy Rail-Ip Mar 22 '16 at 19:36
  • You can use `Launchd` to run one of the sync commands every 5 minutes on a local machine. http://www.splinter.com.au/using-launchd-to-run-a-script-every-5-mins-on/ – cmorrissey Mar 22 '16 at 19:42
  • Interesting. That could definitely solve some of the issues here, but I think it requires some thought - our S3 bucket is expected to hold hundreds of TB of data, while our local NAS will only hold up to 40 TB. Will need to think about when to download files, and which to download... – D'Arcy Rail-Ip Mar 22 '16 at 19:49
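
Edit: to make the AWS SDK suggestion from the comments concrete, here is a rough, untested sketch of how I understand that approach: a PHP endpoint on the portal that streams a requested S3 object back to the operator's browser, so the download lands on their local machine without any terminal or SFTP step. The bucket name, region, key handling, and credential setup below are placeholders, not our real configuration.

```php
<?php
// Rough sketch only (untested): stream an uploaded S3 object to the operator's
// browser from the operations portal. Bucket, region, and key handling are
// placeholders; credentials are assumed to come from the environment.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',          // placeholder region
]);

$bucket = 'our-uploads-bucket';         // placeholder bucket name
$key    = basename($_GET['file']);      // naive sanitising of the requested key

$result = $s3->getObject([
    'Bucket' => $bucket,
    'Key'    => $key,
]);

// Hand the object to the browser as a file download.
header('Content-Type: ' . $result['ContentType']);
header('Content-Disposition: attachment; filename="' . $key . '"');
echo $result['Body'];
```

The button in the portal would then just point at this endpoint (or fetch it from JavaScript), and the browser handles saving the file locally; a checksum comparison against the original upload could be layered on top for the integrity check.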

0 Answers