
I'm looking for a way to programmatically download all photos from any Facebook profile (not just the thumbnails, but the full-size photos). My idea is to use wget to fetch the page source, get the child a elements of all divs with class="rq0escxv rj1gh0hx buofh1pr ni8dbmo4 stjgntxs l9j0dhe7", follow each link and download the image it points to.
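Roughly, I imagine the extraction step looking something like this (just a sketch, assuming the saved HTML actually contains those divs and their links; cheerio here is only an example of an HTML parser, not a requirement):

```typescript
import * as cheerio from "cheerio";
import { readFileSync } from "fs";

// Load the saved page source and collect the hrefs of the <a> children
// of the divs carrying the class combination mentioned above.
const html = readFileSync("profile.html", "utf8");
const $ = cheerio.load(html);

const photoLinks: string[] = [];
$("div.rq0escxv.rj1gh0hx.buofh1pr.ni8dbmo4.stjgntxs.l9j0dhe7 > a").each((_, el) => {
  const href = $(el).attr("href");
  if (href) photoLinks.push(href);
});

console.log(photoLinks); // each link would then be followed to download the full-size image
```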

But how do I get the website code with all of the pictures loaded? I would like a command or some code (not in Python) that saves the page source to, for example, profile.html.

Thanks!!

RORAK

1 Answer


> But how do I get the website code with all of the pictures loaded? I would like a command or some code (not in Python) that saves the page source to, for example, profile.html.

If plain use of wget, i.e. something like

wget -O profile.html url

where url is the URL of the page you are targeting, does not give you a profile.html containing links to all the images, then the site probably uses JavaScript to load additional images, e.g. in response to the user scrolling. In that case wget will not suffice, as it does not support JavaScript execution; you would need a web scraper which does. Pick one written in a language you can tolerate and which lets you download the images.
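As a rough illustration (not a drop-in solution — the URL below is a placeholder, and Facebook may additionally require you to be logged in), a headless-browser scraper such as Puppeteer could render the page, including JavaScript-loaded content, and save the result:

```typescript
import puppeteer from "puppeteer";
import { writeFileSync } from "fs";

(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Placeholder URL; the real profile URL goes here.
  await page.goto("https://www.facebook.com/some.profile/photos", {
    waitUntil: "networkidle2",
  });

  // Save the rendered DOM (after scripts have run), not the raw server response.
  writeFileSync("profile.html", await page.content());

  await browser.close();
})();
```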

Daweo
  • So can you tell me **how** I can use JavaScript to load the additional images? Thanks – RORAK Jul 01 '22 at 11:30
  • if it is sufficient to scroll to the bottom of the page, then see https://stackoverflow.com/questions/11715646/scroll-automatically-to-the-bottom-of-the-page – Daweo Jul 01 '22 at 11:38
  • That doesn't work, because these image elements are loaded continuously: it scrolls to the bottom of the page, but then new elements are added, and so on. Also, you have to set the user agent to Mozilla for `wget` to work at all, because Facebook doesn't accept wget's default one. And if you do that, the final HTML looks like this: ```html – RORAK Jul 01 '22 at 12:05
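Addressing the two points raised in the comments (the continuously loaded photo elements and the user-agent check), the headless-browser sketch above could be extended along these lines — again only a sketch, with the user-agent string being an arbitrary example:

```typescript
// Inside the same async block as the earlier sketch.
// Before page.goto(): spoof a normal browser user agent, since Facebook
// rejects clients that identify themselves as wget or similar tools.
await page.setUserAgent(
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36"
);

// After page.goto(): keep scrolling until the page height stops growing,
// i.e. until no new photo elements are being appended.
let previousHeight = 0;
while (true) {
  const height: number = await page.evaluate(() => document.body.scrollHeight);
  if (height === previousHeight) break; // nothing new was loaded
  previousHeight = height;
  await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));
  await new Promise((resolve) => setTimeout(resolve, 2000)); // give lazy loading time to fire
}
```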