
I have a page on my website that fetches flight details from https://hotelspro.com.

After entering the flight details, I get around 400 records. The problem is that I have to loop over each record and send an individual API request to fetch its hotel info.

I can only use PHP or JavaScript.

Using PHP, it takes forever. I have tried several solutions:

PHP - Solution 1: Using curl_exec()

$count = xxx; // Where xxx is the number of hotels

// The credentials and auth header never change, so build them once outside the loop
$username = 'xxx';
$password = 'xxx';
$auth = base64_encode("$username:$password");

for ($i = 0; $i < $count; $i++) {
    $hotelCode = $json["results"][$i]["hotel_code"];
    $url = "http://cosmos.metglobal.tech/api/static/v1/hotels/" . $hotelCode . "/";

    $curl = curl_init();
    curl_setopt_array($curl, array(
        CURLOPT_URL => $url,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_ENCODING => "",
        CURLOPT_MAXREDIRS => 10,
        CURLOPT_TIMEOUT => 0,
        CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
        CURLOPT_HTTPHEADER => array(
            "Authorization: Basic $auth",
            "Cache-Control: no-cache"
        ),
    ));
    $response = curl_exec($curl);
    $err = curl_error($curl);
    curl_close($curl);

    // Store each response under its own record instead of overwriting $json,
    // which would clobber the "results" array this loop is iterating over
    $json["results"][$i]["hotel_details"] = json_decode($response, true);
}

PHP - Solution 2: Using curl_multi_exec()

The code below is adapted from an answer on Stack Overflow

$urls = array() ;
for ($i = 0; $i < $count; $i++) {
    $urls[] = "http://cosmos.metglobal.tech/api/static/v1/hotels/" . $json["results"][$i]["hotel_code"] . "/" ;
}

/* ********** Start Multi Threading ********** */
$active = 0 ;

// cURL multi-handle
$mh = curl_multi_init();

// This will hold cURLS requests for each file
$requests = array();

$options = array(
    CURLOPT_FOLLOWLOCATION  => true,
    CURLOPT_AUTOREFERER     => true,
    CURLOPT_HTTPHEADER      => array("Content-Type: application/json"),
    CURLOPT_SSL_VERIFYPEER  => false,
    CURLOPT_RETURNTRANSFER  => true,
    CURLOPT_USERPWD         => "xxx:xxx"
);

foreach ($urls as $key => $url) {
    // Add initialized cURL object to array
    $requests[$key] = curl_init($url);

    // Set cURL object options
    curl_setopt_array($requests[$key], $options);

    // Add cURL object to multi-handle
    curl_multi_add_handle($mh, $requests[$key]);

    $active++;
}

// Run until all requests have completed; curl_multi_select() waits for
// activity on the handles instead of spinning the CPU in a tight loop
do {
    curl_multi_exec($mh, $active);
    if ($active > 0) {
        curl_multi_select($mh);
    }
} while ($active > 0);

// Collect all data here and clean up
$j = 0 ;
foreach ($requests as $key => $request) {
    // Read the response body; this must happen before curl_close() on the handle
    $result = curl_multi_getcontent($request);
    curl_multi_remove_handle($mh, $request);
    curl_close($request);

    $json["results"][$j]["hotel_details"] = json_decode($result, true);
    $j++;
}

curl_multi_close($mh);
/* ********** End Multi Threading ********** */

Both PHP solutions take more than a minute to loop through all the records, so I am now trying to send the requests using JavaScript instead.

JavaScript - Solution 1:

for (var i = 0; i < maxCount; i++) {
    var thisHotel = extJson.hotels.results[i];
    var hotelCode = thisHotel.hotel_code;

    // /travel/hotel_pro_details/ is an endpoint on my website that calls the HotelsPro API
    $.get("/travel/hotel_pro_details/" + hotelCode, function (json) {
        // Code handling here
    });
}

The JavaScript solution above also takes a lot of time, but it has one advantage: I can append the results one after another as they come back from the API.

But I am looking for a better solution that reduces the load time. Since I am looping through the records in JavaScript, I am not able to send all the requests at once and then wait for the results.
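For what it's worth, the "start every request immediately, then wait for all of them" pattern can be sketched with plain Promises. The `fetchHotel` function and hotel codes below are placeholders standing in for the real `$.get("/travel/hotel_pro_details/" + code)` call:

```javascript
// Stand-in for the real AJAX call: resolves after a short delay so the
// sketch runs on its own. Replace with $.get(...) in the actual page.
function fetchHotel(code) {
  return new Promise(resolve =>
    setTimeout(() => resolve({ hotel_code: code }), 10)
  );
}

const hotelCodes = ['H1', 'H2', 'H3']; // placeholder codes

// Start every request immediately; the browser runs them concurrently
const requests = hotelCodes.map(fetchHotel);

// Resolve once ALL of them have finished
Promise.all(requests).then(function (hotels) {
  console.log('fetched', hotels.length, 'hotels');
});
```

The requests all go out before any response is awaited, so the total time is roughly that of the slowest request rather than the sum of all of them.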

So my question is:

Is there a way, using JavaScript, to send multiple records in a single AJAX call and then handle the replies one by one?

Thank You...

Mario Rawady
    You can achieve this by using Promises, or even better use `try/catch` with `async/await` – Adeel Imran Dec 12 '17 at 08:38
    Can you send me a link to an online example, or explain more. Because I am not so familiar with them – Mario Rawady Dec 12 '17 at 08:39
  • https://stackoverflow.com/questions/16026942/how-do-i-chain-three-asynchronous-calls-using-jquery-promises – Philipp Meissner Dec 12 '17 at 08:40
  • https://stackoverflow.com/questions/43616018/call-multiple-api-urls-and-call-at-same-time if you find this link useful, kindly give it a thumbs up. – Adeel Imran Dec 12 '17 at 08:42
  • The problem is that the PHP solution is getting *all* the results before returning anything. With a typical Ajax solution you'd get them 1-by-1 at the client, so the user isn't waiting forever for a ton of results when they only want the top 5 (typically). I'd recommend just going down the Ajax route, maybe with lazy loading where the data is only loaded for visible elements. – Reinstate Monica Cellio Dec 12 '17 at 08:56
  • @PhilippMeissner I want the requests to be sent Synchronously (All at the same time) – Mario Rawady Dec 12 '17 at 09:27
  • @AdeelImran I want the requests to be sent Synchronously (All at the same time) – Mario Rawady Dec 12 '17 at 09:27
  • @Archer I am using now the solution you recommended. But I want to make it faster since it is taking more than 100 sec to load all the records – Mario Rawady Dec 12 '17 at 09:28
  • You shouldn't be loading all the records though - that's the problem and you can't avoid it taking that long. Only make Ajax calls for data that you need to display (visible elements) – Reinstate Monica Cellio Dec 12 '17 at 09:29
  • Browsers all have a max concurrent ajax requests limit, so if you have a hard limit of e.g 8, you will only be able to make calls of 8 at a time. – JohanP Dec 12 '17 at 10:21
  • You can not send a single Ajax request to fetch multiple endpoints at once. That does not even make sense if you think about it. Either create one large promise that waits for all requests to finish or create N single requests that you handle individually. If that does not improve your loading the bottleneck is the backend and you should try to improve the loading times there. – Philipp Meissner Dec 13 '17 at 06:33
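As one comment notes, browsers cap the number of concurrent requests per host, so firing 400 at once will still be queued. A small pool that keeps at most N requests in flight can be sketched like this; the `limit` value and the stand-in tasks are placeholders, not real API calls:

```javascript
// Run async tasks with at most `limit` in flight at once.
// `tasks` is an array of functions that each return a Promise.
async function runPool(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++;        // each worker claims the next unclaimed task
      results[i] = await tasks[i]();
    }
  }
  // Spawn `limit` workers that drain the shared task queue
  const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
  await Promise.all(workers);
  return results;              // results stay in the original task order
}

// Example with stand-in "requests" that just resolve after a delay
const tasks = [1, 2, 3, 4, 5].map(n => () =>
  new Promise(res => setTimeout(() => res(n * 10), 5))
);
runPool(tasks, 2).then(out => console.log(out)); // resolves to [10, 20, 30, 40, 50]
```

Each real task would be a function like `() => $.get("/travel/hotel_pro_details/" + code)`, keeping the pool at the browser's effective concurrency limit instead of queueing everything blindly.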

1 Answer


If you are prepared to use an external library, you could consider RxJS and its Observable pattern; its forkJoin method lets you send all your $.get requests simultaneously:

Rx.Observable.forkJoin(array)
  .subscribe(function(data) {
    /* data for all requests available here */
  });


Demo:

$(document).ready(function() {
  // use your for loop to push() your requests into an array
  var array = [
    $.get('https://api.chucknorris.io/jokes/random'),
    $.get('https://api.chucknorris.io/jokes/random'),
    $.get('https://api.chucknorris.io/jokes/random'),
    $.get('https://api.chucknorris.io/jokes/random')
  ];

  // make all the http request at the same time
  Rx.Observable.forkJoin(array)
    .subscribe(function(data) {
      // Handle the response for each request individually
      $('body').append(data[0].value + '<br><hr>');
      $('body').append(data[1].value + '<br><hr>');
      $('body').append(data[2].value + '<br><hr>');
      $('body').append(data[3].value + '<br>');
    });
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/4.0.7/rx.all.js"></script>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
Michael Doye
  • I tried the method you mentioned, but the response time didn't improve. I tried making 10 API calls at the same time, and it waited until all the calls had executed and then sent me all the results at once. I want to be able to retrieve each call on its own when it is ready, not wait for all of them to load – Mario Rawady Dec 12 '17 at 15:45
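What this last comment asks for can be sketched without waiting on forkJoin: keep the requests concurrent, but attach a handler to each promise individually so every result can be appended the moment it arrives, with Promise.all used only to detect when the whole batch is done. `fetchJoke` below is a placeholder for the `$.get` calls in the demo:

```javascript
// Placeholder for one of the $.get requests in the demo above; resolves
// after a random delay so completion order varies, like real HTTP calls
function fetchJoke(id) {
  return new Promise(resolve =>
    setTimeout(() => resolve({ id: id, value: 'joke ' + id }), Math.random() * 20)
  );
}

const requests = [1, 2, 3, 4].map(fetchJoke);

// Handle each response the moment it is ready (append to the page here),
// instead of waiting for forkJoin to gather all of them
requests.forEach(p => p.then(joke => console.log('ready:', joke.id)));

// Still know when the whole batch has completed
Promise.all(requests).then(() => console.log('all done'));
```

In RxJS terms, this is the difference between forkJoin (one emission after everything completes) and merging the streams so each response is emitted as it finishes.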