
I have to download all log files from a virtual directory within a site. Access to the virtual directory itself is forbidden, but the files inside it are accessible.

So far I have been entering the file names to download manually:

dir="Mar"
for ((i=1;i<100;i++)); do
   wget http://sz.dsyn.com/2014/$dir/log_$i.txt
done

The problem is that the script is not generic: most of the time I need to find out how many files there are and tweak the for loop. Is there a way to get wget to fetch all the files without my having to specify the exact count?

Note: If I use the browser to view http://sz.dsyn.com/2014/$dir, it returns 403 Forbidden, so I can't pull all the files via a browser tool/extension.

  • Is there a realistic maximum? Why not just loop to a high number? If it fails, it'll fail quickly and you can just ignore it. – arco444 Mar 11 '14 at 14:07

2 Answers


First of all, check this similar question. If that is not what you are looking for, you need to generate a file of URLs and feed it to wget, e.g.

 wget --input-file=http://sz.dsyn.com/2014/$dir/filelist.txt
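
If the server does not publish such a list, one workaround (a sketch only, reusing the log_$i.txt naming from the question and a hypothetical generous upper bound) is to generate the candidate URLs yourself and let wget skip over the ones that do not exist:

 dir="Mar"
 # Write candidate URLs to a local list; wget simply reports 404 for names that do not exist.
 for ((i=1;i<1000;i++)); do
    echo "http://sz.dsyn.com/2014/$dir/log_$i.txt"
 done > filelist.txt
 # --input-file (-i) also accepts a local file containing one URL per line.
 wget --input-file=filelist.txt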
hawk

wget will have the same problem your browser has: it cannot read the directory listing. Just pull files until the first failure and then stop.
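
A minimal sketch of that approach, assuming the files are numbered consecutively from 1 with no gaps (a gap would end the loop early):

 dir="Mar"
 i=1
 # wget exits non-zero when the server returns an error (e.g. 404), which ends the loop.
 while wget "http://sz.dsyn.com/2014/$dir/log_$i.txt"; do
    i=$((i+1))
 done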

Bruce K