
Hello folks and friends from SO!

This is for the development of a web spider. We basically need to take a snapshot of a certain website. We would have the full URL, as well as the source code returned from a cURL request.
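For reference, the fetch itself is roughly along these lines (the URL here is just a placeholder):

```php
<?php
// Minimal sketch of the kind of cURL fetch described above.
$url = 'http://example.com/';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of echoing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
$html = curl_exec($ch);
curl_close($ch);
```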

Since the HTML should be visualized as a browser does, I believe we'll have to render it somehow before taking the screenshot.

Is there any recommendation or proper approach for doing this?

Thanks in advance, Chris.

Chris Russo
    You're right. Duplicate! Thanks Alex. How can we close this? :) – Chris Russo May 21 '13 at 03:30
  • You can delete it or it will be closed after 3 more close votes. – Alex W May 21 '13 at 03:31
  • @ChrisRusso yes, you can delete it. but have you tried getting the page with `file_get_contents()` ? – samayo May 21 '13 at 03:32
  • @phpNoOb He would still need to render it before taking a screenshot. – Alex W May 21 '13 at 03:33
  • @phpNoOb, that's correct. We're already getting thousands of pages, and information from the system itself, using curl and sockets, but we still need something to display it properly. – Chris Russo May 21 '13 at 03:34
  • At the same time, the solution provided in the other question doesn't really fit our needs, since it would only render the code, but external images and complex tags would be left out. I'll answer my own question just for the record. – Chris Russo May 21 '13 at 03:35

1 Answer


You'll more than likely have to install an additional library or tool on your server to get this done, if anything suitable is available for PHP at all. As PHP is a server-side language, it isn't geared toward rendering pages. Perhaps this previous thread can point you in the right direction: Website screenshots using PHP
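If something like wkhtmltoimage is available on the server (just an assumption here, it isn't bundled with PHP and would have to be installed separately), a rough sketch of driving it from PHP could look like this; the URL and output path are placeholders:

```php
<?php
// Hypothetical sketch: render a URL straight to a PNG with the wkhtmltoimage
// command-line tool, assuming it is installed on the server and on the PATH.
$url    = 'http://example.com/';   // placeholder target
$output = '/tmp/snapshot.png';     // placeholder output path

$cmd = sprintf(
    'wkhtmltoimage %s %s 2>&1',
    escapeshellarg($url),
    escapeshellarg($output)
);

exec($cmd, $outputLines, $exitCode);

if ($exitCode !== 0) {
    // Rendering failed; $outputLines holds the tool's messages for debugging.
    error_log(implode("\n", $outputLines));
}
```

Since the tool runs a real rendering engine, external images and stylesheets referenced by the page are fetched and drawn, which is what plain HTML-to-image conversion in PHP can't do.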

However, if you are open to a paid API service, GitHub uses: http://url2png.com/plans/

Stuart Hannig