Is it possible to use the `wget` command on Linux to download all files in a single directory of a website?
I can recursively grab an entire website with `--mirror` and similar options, but I would like to fetch just the files in one directory. In my mind, it would look something like:
wget http://www.somesite.com/here/is/some/folders/*
This would download ALL files in the `/folders/` directory (it doesn't need to recurse into subdirectories). But the wildcard character doesn't seem to work with wget, so I'm looking for the correct approach.
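For context, here's roughly what I've tried so far. The flag combinations are my reading of the wget man page, so I may well be misusing them (the URL is just the example above):

```shell
# A full recursive mirror works, but pulls down far more than I want;
# --no-parent at least keeps it from climbing above /folders/:
wget --mirror --no-parent http://www.somesite.com/here/is/some/folders/

# My guess at a single-directory fetch: recursion limited to depth 1 (-l1),
# never ascend to the parent (-np), and don't recreate the remote
# directory hierarchy locally (-nd):
wget -r -l1 -np -nd http://www.somesite.com/here/is/some/folders/
```

The second command seems close to what I want, but I'm not sure whether it only picks up files that are linked from the directory's index page, or whether it really gets everything in the directory.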