
I'm building a PHP 'web crawler' to populate a database with housing records, which I then want to use for analysis. The problem is that the website I'm trying to access has two pages:

  • on the first, I need to select an option from a drop-down list, after which the page automatically reloads (the page uses a POST form submit)
  • once the POST variable is set, the page loads the individual records that I'm interested in

My question is whether I can access this second page directly and read its data into a string using PHP? Normally I would just use file_get_contents() for this, but I don't know how to attach the POST data to that request.
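For reference, here is a minimal sketch of the kind of thing I mean. The URL and the form field name `region` are placeholders — the real field name has to be taken from the `name` attribute of the site's `<select>` element:

```php
<?php
// Sketch: fetch a page that expects a POST form submit, using only
// file_get_contents() plus a stream context (no cURL).
// NOTE: 'region' and the URL below are placeholder assumptions.

// Build an application/x-www-form-urlencoded request body.
$postData = http_build_query([
    'region' => 'some-option-value', // the value of the chosen <option>
]);

// Describe the HTTP request in a stream context.
$context = stream_context_create([
    'http' => [
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
        'content' => $postData,
    ],
]);

// Requires allow_url_fopen to be enabled in php.ini.
$html = file_get_contents('https://example.com/housing-list', false, $context);
if ($html === false) {
    die('Request failed');
}
echo $html;
```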

  • [Sure](http://stackoverflow.com/a/6609181/1870760), just send the post and read the response. – Hatted Rooster Nov 09 '16 at 19:21
  • You can use cURL to make HTTP requests in code and read the responses. – David Nov 09 '16 at 19:22
  • 1
    http://stackoverflow.com/questions/5647461/how-do-i-send-a-post-request-with-php looks like a dupe to me – Funk Forty Niner Nov 09 '16 at 19:22
  • just found this: http://stackoverflow.com/questions/2445276/how-to-post-data-in-php-using-file-get-contents is there any good reason to use curl instead? – Thomas Blomme Nov 09 '16 at 19:25
  • @ThomasBlomme Yes. cURL gives you a lot more flexibility in the request and the response, plus there's multi_curl so you can make many requests at once. The fopen wrapper is more of a utility hack and isn't really meant as a full HTTP client. The HTTP fopen wrapper is actually disabled in many environments. – Brad Nov 09 '16 at 19:27
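Following Brad's suggestion, the same POST could be sketched with cURL instead (again, the URL and the `region` field name are placeholder assumptions, not taken from the actual site):

```php
<?php
// Sketch: the same POST request using the cURL extension.
$ch = curl_init('https://example.com/housing-list');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query(['region' => 'some-option-value']),
    CURLOPT_RETURNTRANSFER => true, // return the body instead of echoing it
    CURLOPT_FOLLOWLOCATION => true, // follow a post-submit redirect, if any
]);
$html = curl_exec($ch);
if ($html === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);
echo $html;
```

As noted in the comments, cURL also works in environments where the HTTP fopen wrappers (`allow_url_fopen`) are disabled, which is one practical reason to prefer it over `file_get_contents()` for crawling.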

0 Answers