
Imagine I have the following function:

public function getSight() {
    $ids = array(1,2);
    $get1 = file_get_contents("https://maps.googleapis.com/maps/api/place/details/json?placeid=" . $ids[0] . "&key=");
    $get2 = file_get_contents("https://maps.googleapis.com/maps/api/place/details/json?placeid=" . $ids[1] . "&key=");
}

I'm curious, would that function take twice as long to execute compared to the same function with only 1 GET request? What if I had 10 ids in the array and I wanted to run 10 GET requests, would it run 10 times slower? Is there any way to execute those GET requests simultaneously instead of one after another?

I'm asking this because, in order to get the information about a certain place from the Google Places API, you have to make a GET request with the ID of the place. However, I don't think the API allows me to get multiple places in a single GET request, so I have to execute as many GET requests as I have saved IDs in the database. Surely there's a way to do this efficiently.

Onyx
  • https://stackoverflow.com/questions/124462/how-to-make-asynchronous-http-requests-in-php – Marty Apr 20 '19 at 06:37

1 Answer


Maybe not with file_get_contents, but as far as I know you can do that via curl_multi_init:

https://php.net/manual/en/function.curl-multi-init.php

With file_get_contents, if one of your requests fails or takes a long time, your whole script will be slow. I would also suggest setting a timeout value.
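To illustrate, here is a minimal sketch using curl_multi_init to run all the Place Details requests concurrently, so total time is roughly that of the slowest request rather than the sum of all of them. The function name getSights and the $apiKey parameter are placeholders, and the per-request timeout is an arbitrary choice:

```php
<?php
// Sketch: fetch several Google Places results concurrently.
// $ids is a list of place IDs; $apiKey is your API key (placeholder here).
function getSights(array $ids, string $apiKey): array {
    $mh = curl_multi_init();
    $handles = [];

    foreach ($ids as $id) {
        $ch = curl_init(
            "https://maps.googleapis.com/maps/api/place/details/json?placeid="
            . urlencode($id) . "&key=" . $apiKey
        );
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // so one slow request can't stall the script
        curl_multi_add_handle($mh, $ch);
        $handles[$id] = $ch;
    }

    // Drive all transfers until every one has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    // Collect the responses, keyed by place ID.
    $results = [];
    foreach ($handles as $id => $ch) {
        $results[$id] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

With this approach, 10 IDs cost roughly one round trip instead of ten sequential ones.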

Daniel Kemeny