I want to upload files to my servers at DigitalOcean and AWS. I can do that from the terminal using scp or sftp, but I want to automate it in Python (or another programming language). In Python, at a high level, how can I upload a file to a server? Should I use an SFTP client? Are there other options?
- Why not a shell script? – Ignacio Vazquez-Abrams Jun 08 '16 at 05:10
- https://pypi.python.org/pypi/scp – MaxNoe Jun 08 '16 at 06:26
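For reference, a minimal sketch of what the scp package linked above looks like in use, layered on a paramiko SSH connection (the hostname, credentials, and paths are placeholders):

import paramiko
from scp import SCPClient

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect('hostname', username='me', password='secret')  # placeholder credentials

with SCPClient(ssh.get_transport()) as scp:
    scp.put('/my/local/filename', '/remote/path/filename')  # upload local file
    scp.get('/remote/path/filename')                         # download remote file

ssh.close()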
2 Answers
You can use the pysftp package:
import pysftp

with pysftp.Connection('hostname', username='me', password='secret') as sftp:
    with sftp.cd('public'):               # temporarily chdir to public
        sftp.put('/my/local/filename')    # upload file to public/ on remote
        sftp.get_r('myfiles', '/backup')  # recursively copy myfiles/ to local
https://pypi.python.org/pypi/pysftp
pysftp uses paramiko internally, and paramiko itself can also be used directly for SSH, SFTP, etc.: http://docs.paramiko.org/en/1.17/api/sftp.html
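If you prefer to use paramiko directly rather than through pysftp, an upload looks roughly like this (hostname, credentials, and paths are placeholders):

import paramiko

# open a transport and authenticate, then start an SFTP session over it
transport = paramiko.Transport(('hostname', 22))
transport.connect(username='me', password='secret')
sftp = paramiko.SFTPClient.from_transport(transport)

sftp.put('/my/local/filename', '/remote/path/filename')  # upload local file to remote path

sftp.close()
transport.close()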

cdagli
- You need to add the private_key parameter while connecting; it is all documented: http://pysftp.readthedocs.io/en/release_0.2.8/cookbook.html – cdagli Jun 08 '16 at 06:28
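As a sketch of what the comment above describes, key-based authentication with pysftp looks roughly like this (the hostname, username, and key path are placeholders):

import pysftp

# authenticate with a private key instead of a password
with pysftp.Connection('hostname', username='me',
                       private_key='/home/me/.ssh/id_rsa') as sftp:
    sftp.put('/my/local/filename')  # upload file to the remote home directory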
- For example, for Google Cloud you can use the "gcloud compute copy-files" command to copy local files to remote instances. Or, if they are bigger asset files, you can store them in S3 or GCS, and both have command-line tools. Or, if you are talking about source files, you can use git to deploy to your servers. There are other options, but you have to be more specific about your needs. – cdagli Jun 08 '16 at 07:08
The AWS way to go about this would be to upload files to S3 and then make your buckets accessible to your DigitalOcean and AWS EC2 servers. That way you have a single, central store for your files that is highly durable and can be accessed by as many servers as you need, so it scales easily.
The AWS SDK for Python (Boto3) can be found here:
https://aws.amazon.com/sdk-for-python/
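A minimal upload/download sketch with boto3; the bucket name, object key, and file paths are placeholders, and credentials are assumed to be configured already (e.g. via environment variables or ~/.aws/credentials):

import boto3

s3 = boto3.client('s3')

# push a local file up to S3
s3.upload_file('/my/local/filename', 'my-bucket', 'path/in/bucket/filename')

# each server can then pull the same object down from S3
s3.download_file('my-bucket', 'path/in/bucket/filename', '/tmp/filename')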
Alternative AWS S3 package (simples3)
http://sendapatch.se/projects/simples3/