I'm trying to scrape web pages.
I want to download a web page, given its URL, and save it for offline reading together with all of its images. I can't manage to do that with wget, since it creates many nested directories.
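For reference, this is roughly what I've been running (the exact flags and URL here are placeholders, not my actual command):

```shell
# Recursive fetch of a page plus the resources it needs.
# --page-requisites pulls in images, CSS, etc., but wget
# mirrors the remote path structure, so I end up with a
# tree of nested directories instead of one folder.
wget --recursive --level=1 --page-requisites http://example.com/some/page.html
```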
Is this possible with wget? Is there something like the "Save as" option in Firefox, which creates a single directory and puts the HTML page and all its required resources into it?
Would it be possible to do this with Nokogiri or Mechanize?