I can download files from a server I control in only one way: by passing the document ID into a link, like so:

https://website/deployLink/442/document/download/$NUMBER

If I navigate to this in my browser, it downloads the file with ID $NUMBER.

The problem is, I have 9,000 files on my server.

How can I neatly download them all into one folder? I guess a JavaScript solution would be best here. Something like

for (let i = 1; i <= 9000; i++) {
  // pseudocode: save the file with ID i into the target folder
  download("C:/Users/FolderOfFiles", "https://website/deployLink/442/document/download/" + i);
}

is the functionality I want. What's the best and cleanest way to implement this?

I should add: this is the ONLY way I can download from the server. There is a single API call exposed to me, and I do not have FTP access. This is the only way :).

Simon Kiely
  • I would do it with PHP like this http://stackoverflow.com/questions/724391/saving-image-from-php-url, using the `file_put_contents` function – Jacob Mar 25 '15 at 15:54
  • Since the IDs are sequential, I would use [wget](https://www.gnu.org/software/wget/). Just type this: `wget https://website/deployLink/442/document/download/{0001..9000}` – burningfuses Mar 25 '15 at 16:01

1 Answer

Since you want to save them straight to the file system, JavaScript is probably not a viable option; a browser page cannot write files to an arbitrary folder. That said, there are plenty of ways to do this. Open up your command line and navigate to the folder you want to dump the files in.

If you're on Linux, you can use wget:

for i in {1..9000}; do wget https://website/deployLink/442/document/download/$i; done

If you're on Windows, the easiest way to do this is probably still wget (you will need to install a Windows build first). Run this directly at the prompt; inside a batch file, double the percent signs (%%i):

FOR /L %i in (1,1,9000) do wget https://website/deployLink/442/document/download/%i
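
If wget isn't an option, a short script works on either platform. Here is a minimal Python sketch using only the standard library; the URL pattern and the 1..9000 ID range come from the question, while the output folder name is a placeholder and the assumption that every ID resolves to a file may not hold:

import os
import urllib.error
import urllib.request

BASE_URL = "https://website/deployLink/442/document/download/"
OUT_DIR = "FolderOfFiles"  # placeholder; point this at your target folder

os.makedirs(OUT_DIR, exist_ok=True)

for doc_id in range(1, 9001):
    target = os.path.join(OUT_DIR, str(doc_id))
    try:
        # Fetch each document by ID and save it under that ID.
        urllib.request.urlretrieve(BASE_URL + str(doc_id), target)
    except urllib.error.HTTPError as err:
        # Skip IDs the server rejects (e.g. deleted documents).
        print("skipped", doc_id, err.code)

This saves each file under its numeric ID; if the server sends real filenames in a Content-Disposition header, you would need to read the response headers yourself (or use wget's --content-disposition flag) to recover them.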

Hope that points you in a helpful direction.

Austen