I have a URL whose page links to more than a hundred xlsx files:

url = https://www.mrud.ir/%D9%85%D8%B3%DA%A9%D9%86/%D8%A7%D9%82%D8%AA%D8%B5%D8%A7%D8%AF-%D9%85%D8%B3%D9%83%D9%86-%D9%88-%D8%A8%D8%B1%D9%86%D8%A7%D9%85%D9%87-%D8%B1%D9%8A%D8%B2%D9%8A/%D8%A2%D9%85%D8%A7%D8%B1-%D9%88-%D8%A7%D8%B7%D9%84%D8%A7%D8%B9%D8%A7%D8%AA#196661381-------

I am looking for a way to download and save all of these files to disk before I can bind them together.

1 Answer

Use the wget command:

wget --no-parent -r http://WEBSITE.com/DIRECTORY

(You may need to enclose the URL in quotes)
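
For example, quoting the URL and using wget's standard -A accept filter so the recursive crawl only keeps xlsx files (-nd saves everything into a flat directory instead of mirroring the site structure):

wget --no-parent -r -nd -A "*.xlsx" "http://WEBSITE.com/DIRECTORY"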

If you need an R-specific solution, take a look at the download.file() utility and its built-in support for wget (method = "wget").
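
For instance, here is a minimal sketch of that approach, assuming the page lists the files as ordinary <a href> links ending in .xlsx; the rvest scraping package and the xlsx_files destination folder are illustrative choices, not part of the original answer:

library(rvest)   # read_html(), html_elements(), html_attr()

url <- "https://www.mrud.ir/%D9%85%D8%B3%DA%A9%D9%86/%D8%A7%D9%82%D8%AA%D8%B5%D8%A7%D8%AF-%D9%85%D8%B3%D9%83%D9%86-%D9%88-%D8%A8%D8%B1%D9%86%D8%A7%D9%85%D9%87-%D8%B1%D9%8A%D8%B2%D9%8A/%D8%A2%D9%85%D8%A7%D8%B1-%D9%88-%D8%A7%D8%B7%D9%84%D8%A7%D8%B9%D8%A7%D8%AA"

# Collect every link on the page, resolve relative paths, keep only .xlsx files
links <- html_attr(html_elements(read_html(url), "a"), "href")
links <- links[!is.na(links)]
links <- xml2::url_absolute(links, url)
xlsx  <- unique(links[grepl("\\.xlsx$", links, ignore.case = TRUE)])

# Download each file to disk; mode = "wb" is required for binary formats like xlsx
dir.create("xlsx_files", showWarnings = FALSE)
for (u in xlsx) {
  download.file(u, destfile = file.path("xlsx_files", basename(u)), mode = "wb")
}

After that, list.files("xlsx_files", full.names = TRUE) gives the paths to read in and bind together.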

Also see this prior question and discussion.