I have an API call from one of my applications to another of my applications, made via cURL with POST variables, like so:
$curl = curl_init();
curl_setopt_array($curl, array(
    CURLOPT_URL            => $url,
    CURLOPT_POST           => 1,
    CURLOPT_POSTFIELDS     => $paramString,
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_TIMEOUT        => 600,
    CURLOPT_CONNECTTIMEOUT => 60,
));
In the application behind $url I try to fetch 20 million records from one table and then return them as JSON, like this:
public function apiMethod()
{
    $response = $this->_db->fetchAll('SELECT t.hash FROM table t WHERE id BETWEEN 20000000 AND 40000000;');
    echo json_encode($response);
}
Using PDO of course.
OK, there are two problems with that:
1. fetchAll() doesn't work for that many records: it exceeds the memory limit.
2. json_encode() doesn't work for that many records either: the encoded output gets too large.
I've been thinking about calling cURL several times and fetching, say, 100,000 records per call instead of fetching everything at once. Is this the only way? What's the best way to do that?
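To make that idea concrete, here is a rough sketch of what I have in mind on the calling side. It is only a sketch: the offset and limit POST fields are made-up names, and the other application would have to honour them by adding LIMIT/OFFSET to its query.

// Hypothetical chunked version of the call above: request 100,000 rows at a time
// until a short (or empty) chunk comes back.
$chunkSize = 100000;
$offset    = 0;

do {
    $curl = curl_init();
    curl_setopt_array($curl, array(
        CURLOPT_URL            => $url,
        CURLOPT_POST           => 1,
        // 'offset' and 'limit' are invented field names; apiMethod() would
        // have to read them and append LIMIT/OFFSET to its SELECT.
        CURLOPT_POSTFIELDS     => http_build_query(array(
            'offset' => $offset,
            'limit'  => $chunkSize,
        )),
        CURLOPT_RETURNTRANSFER => 1,
        CURLOPT_TIMEOUT        => 600,
        CURLOPT_CONNECTTIMEOUT => 60,
    ));
    $body = curl_exec($curl);
    curl_close($curl);

    $chunk = json_decode($body, true);
    if (!is_array($chunk)) {
        break; // failed request or invalid JSON, stop here
    }

    // ... process this chunk of hashes ...

    $offset += $chunkSize;
} while (count($chunk) === $chunkSize);

That would keep each request well under the memory limit on both sides, but it also means 200 round trips for 20 million rows, which is part of why I'm asking whether there's a better way.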