
For example, when I open https://stackoverflow.com/ in a browser, the browser downloads not only the main page but also the images, JS, and CSS it references.

But when I run curl https://stackoverflow.com/, only the main page's HTML is downloaded. Are there any options for curl or wget that download the images/JS/CSS as well?

Or is there any other tool that can do this?

Sato
  • Possible duplicate? https://stackoverflow.com/questions/6348289/download-a-working-local-copy-of-a-webpage – Max Forasteiro Dec 28 '17 at 01:19
  • Possible duplicate of [Download a working local copy of a webpage](https://stackoverflow.com/questions/6348289/download-a-working-local-copy-of-a-webpage) – rink.attendant.6 Dec 28 '17 at 01:50

1 Answer


wget -r will recursively save the page and everything it links to:

wget -r www.your-site.com
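For grabbing a single page together with the images/CSS/JS it needs, wget's page-requisites mode is usually a better fit than an unbounded recursive crawl. A sketch using standard GNU Wget flags (swap in your own URL):

```shell
# Download one page plus the assets it needs to render, and rewrite
# links so the local copy works offline:
#   -p / --page-requisites : also fetch images, stylesheets, scripts
#   -k / --convert-links   : rewrite links for local viewing
#   -E / --adjust-extension: save text/html responses with a .html suffix
#   -H / --span-hosts      : allow fetching assets hosted on other domains (CDNs)
wget -p -k -E -H https://stackoverflow.com/
```

Note that -H lets wget follow asset links onto other hosts; if that pulls in too much, add -D with a comma-separated domain list to keep the download bounded.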
Alfred George