I do have permission to do this.
I've got a website with about 250 pages from which I need to download the 'product descriptions' and 'product images'. How do I do it? I'd like to get the data out into a CSV, so that I can put it in a DB table. Could someone point me to a good tutorial to get started on this? I should be using cURL, right?
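For the DB end of it, I'm assuming something like PostgreSQL's \copy can load the finished CSV straight into a table; the products table and column names below are made up, not my real schema:
# Load the finished CSV into Postgres (hypothetical table/column names).
psql dbname -c "\copy products (url, description) FROM 'products.csv' CSV HEADER"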
So far, I've got this from another Stack Overflow question, "How do I transfer wget output to a file or DB?":
curl somesite.com | grep sed etc | sed -e "s/^\(.*\)$/INSERT INTO tableName (columnName) VALUES ('\1');/" | psql dbname
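What I think I actually want is one CSV row per page rather than SQL inserts. Here's a rough sketch of what I have in mind (urls.txt, the class="description" pattern, and products.csv are placeholders for my site; I haven't got this working):
#!/bin/bash
# Hypothetical list of the ~250 product page URLs, one per line.
PRODUCT_URLS="urls.txt"
echo '"url","description"' > products.csv
while read -r url
do
    # Pull the page, keep the (single-line) description markup,
    # strip the tags, and escape double quotes for CSV.
    desc=$(curl -s "$url" \
        | grep -o '<div class="description">[^<]*</div>' \
        | sed -e 's/<[^>]*>//g' -e 's/"/""/g')
    echo "\"$url\",\"$desc\"" >> products.csv
done < "$PRODUCT_URLS"
The grep pattern will obviously depend on the real markup.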
And I created this, which sucks, to get the images:
#!/bin/bash
# Dump the page source, take the 8th double-quote-delimited field,
# and keep anything containing "jpg" (very fragile: it depends on
# the exact attribute order in the page's markup).
lynx --source "www.site.com" | cut -d'"' -f8 | grep jpg | while read -r image
do
    wget "www.site.com/$image"
done
I put it together by watching this video: http://www.youtube.com/watch?v=dMXzoHTTvi0.
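What I'd really like, though, is one script that walks all ~250 pages and pulls every product image from each. My best guess so far is below (urls.txt and the img-tag pattern are guesses for my markup, and it assumes relative src paths; not tested):
#!/bin/bash
# Hypothetical inputs: a file with the ~250 product page URLs,
# and the site root to prepend to relative image paths.
PRODUCT_URLS="urls.txt"
SITE_ROOT="http://www.site.com"
mkdir -p images
while read -r url
do
    # Pull the src of every <img> tag on the page and fetch it into ./images.
    curl -s "$url" \
        | grep -o '<img[^>]*src="[^"]*"' \
        | sed -e 's/.*src="//' -e 's/"$//' \
        | while read -r image
          do
              wget -nc -P images "$SITE_ROOT/$image"
          done
done < "$PRODUCT_URLS"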