I'm interested in downloading a bunch of webpages for later analysis. There are two things I would like to do:
- Download the page and its associated resources (images, multiple pages associated with an article, etc.) to a WARC file.
- Change all links to point to the now-local files.
I would like to do this in Python.
Are there any good libraries for doing this? Scrapy seems designed to scrape whole websites rather than single pages, and I'm not sure how to generate WARC files with it. Calling out to wget is a doable solution if there isn't something more Python-native (a sketch of that fallback is below). Heritrix is complete overkill, and not much of a Python solution. wpull would be ideal if it had a well-documented Python library, but it seems to be mostly an application instead.
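For concreteness, this is roughly the wget fallback I have in mind; it's just a sketch using flags from wget's manual, and `archive_page` and its arguments are names I made up, not from any library:

```python
import subprocess

def archive_page(url, warc_name, dest_dir="archive"):
    """Fetch a single page plus its requisites with wget,
    writing a WARC and rewriting links to the local copies."""
    subprocess.run(
        [
            "wget",
            "--page-requisites",         # grab the images, CSS, and scripts the page needs
            "--convert-links",           # rewrite links in the saved files to local paths
                                         # (only the on-disk copies are rewritten, not the WARC records)
            "--adjust-extension",        # give saved files sensible extensions
            "--warc-file=" + warc_name,  # also record everything to warc_name.warc.gz
            "--directory-prefix=" + dest_dir,
            url,
        ],
        check=True,                      # raise CalledProcessError if wget fails
    )

archive_page("http://example.com/some-article", "some-article")
```

That covers both requirements, but it feels clunky compared to a proper library.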
Any other ideas?