
I have to use cURL on Windows from a Python script. My goal: using the Python script, get all files from a remote directory, preferably into a local directory. After that I will compare each file with the files stored locally. I am able to get one file at a time, but I need to get all of the files from the remote directory. Could someone please advise how to get multiple files? I use this command: curl.exe -o file1.txt sftp:///dir1/file1.txt -k -u user:password

thanks

susja
  • Windows doesn't have `curl` unless you use CygWin. PowerShell 3+ has `Invoke-WebRequest` (aliased to `curl`, but args are different). In any event, if you're using Python, why not use Python (`urllib`)? – Nick T Sep 24 '14 at 17:57
  • If you "must" use curl, how does this have anything to do with Python? – cmd Sep 24 '14 at 17:57
  • Even if you need some features from `curl` that you don't know how to map to `urllib`, is there a reason you don't want to use `pycurl`? – abarnert Sep 24 '14 at 18:02
  • Also, see http://stackoverflow.com/a/395481/194586 – Nick T Sep 24 '14 at 18:02
  • @NickT: You can also install curl from an installer on their website, or via MinGW; in either case it's a native Win32 app named `curl.exe` rather than a cygwin app named `curl`. So, I think the asker has already gotten that part working. – abarnert Sep 24 '14 at 18:04
  • @abarnert fair enough, though if I was making something on Windows I'd just try to use what is guaranteed to be there (Windows 8 has `iwr`), or minimize extra requirements (so I don't have to install Python, *and* `curl`/MinGW, etc.). In any event...if Python then Python. – Nick T Sep 24 '14 at 18:07
  • You mean `if Python: Python`, right? (Or maybe `Python if Python else Python_damnit`?) :) But yeah, I understand what you mean. It's sometimes tough to decide whether Windows 8 or later is a more or less onerous requirement than curl, but `urllib` is definitely less than either if you're already requiring Python. – abarnert Sep 24 '14 at 18:37
  • Well, in my case I already have curl working on Windows. I have to get files not from a URL but using SFTP. I made it work for me, but as I mentioned it allows me to get only one file at a time, and my goal is to get all files at once. Not sure if other Python packages could handle SFTP, and I'm kind of close to my goal except for the one thing I asked about. – susja Sep 24 '14 at 18:40
  • Folks, I planned to call curl from inside the Python script. Are you saying that I could use Python to get files using SFTP? Which module will do it? If it's not too complex I might change my mind and switch to a Python module instead of calling curl from inside the script. – susja Sep 24 '14 at 18:53
  • ahh -- sftp changes things a bit. There's a very good sftp implementation for native Python included in paramiko, but that's not standard-library. – Charles Duffy Sep 24 '14 at 19:02
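Following up on the curl route in the comments: curl can return a bare directory listing when an SFTP URL ends in a slash and `--list-only` is passed, so one way to get every file is to list the remote directory first and then fetch each name. This is an untested sketch; the host, path, and credentials are placeholders you would replace:

```python
import subprocess

def parse_listing(listing):
    # With curl's --list-only flag the listing is one bare filename per line
    return [line.strip() for line in listing.splitlines() if line.strip()]

def list_remote_files(base_url, user_pass):
    # A trailing slash on the SFTP URL makes curl return a directory listing
    out = subprocess.check_output(
        ['curl.exe', '-k', '-u', user_pass, '--list-only', base_url])
    return parse_listing(out.decode())

def fetch_all(base_url, user_pass):
    # Download each listed file into the current local directory
    for name in list_remote_files(base_url, user_pass):
        subprocess.check_call(
            ['curl.exe', '-k', '-u', user_pass, '-o', name, base_url + name])

# fetch_all('sftp://example.com/dir1/', 'user:password')  # placeholder values
```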

1 Answer


I haven't tested this, but you could launch each curl command as a separate process so the downloads run simultaneously. With a large set of files this might be a bad idea, so you might need to cap the number of concurrent processes. Here's some untested code; you'd need to edit the `cmd` variable in the `get_file` function, of course.

from multiprocessing import Process
import subprocess

def get_file(filename):
    # Edit this command to match your host, remote path, and credentials
    cmd = 'curl.exe -o {0} sftp:///dir1/{0} -k -u user:password'.format(filename)
    subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)

if __name__ == '__main__':  # this guard is required for multiprocessing on Windows
    files = ['file1.txt', 'file2.txt', 'file3.txt']
    processes = [Process(target=get_file, args=(f,)) for f in files]
    for p in processes:
        p.start()  # launch all downloads concurrently
    for p in processes:
        p.join()   # wait for every download to finish
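The question also mentions comparing each downloaded file with the files stored locally. That step isn't part of the answer above, but a small sketch using SHA-256 digests from the standard library would work; the paths here are hypothetical:

```python
import hashlib

def file_digest(path):
    # Hash the file in chunks so large files aren't loaded into memory at once
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            h.update(chunk)
    return h.hexdigest()

def same_contents(path_a, path_b):
    # Two files match if and only if their digests match (for practical purposes)
    return file_digest(path_a) == file_digest(path_b)

# same_contents('file1.txt', 'local_copies/file1.txt')  # hypothetical paths
```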
Derek Kurth
  • Thanks for your suggestions. Based on your advice I successfully completed my goal using pysftp. – susja Sep 25 '14 at 21:34