2

I have a table with ~70k records that is ~12 MiB in size.

require_once('connection.php'); // expected to provide $con (a mysqli connection)
$sql = "SELECT * FROM BlahBlah";
$result = $con->query($sql);
$data = array();
while($row = $result->fetch_assoc()) {
  $data[] = $row; // accumulate every row in one PHP array
}
echo json_encode($data); // encode the whole array at once

When I loop over the result with fetch_assoc() in PHP, store the rows, and echo them all back to JavaScript in JSON format, I get: Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes).

I understand that more than 12 MiB of memory is definitely required, but needing roughly ten times the size of the data throws me off. I have one other query before this, and it only grabs a 151 KB table.
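
For reference, the overhead itself can be measured around the loop with PHP's memory functions (just a diagnostic sketch of the same query, not part of my actual script):

require_once('connection.php');
$result = $con->query("SELECT * FROM BlahBlah");

$before = memory_get_usage();
$data = array();
while($row = $result->fetch_assoc()) {
  $data[] = $row;
}
// compare the array's real footprint with the ~12 MiB of raw table data
echo 'array: ' . (memory_get_usage() - $before) . ' bytes, ';
echo 'peak: ' . memory_get_peak_usage() . ' bytes';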

EDIT: I've read the blog post linked in the comments below to understand why PHP arrays need so much overhead.

One solution is to raise the memory limit (or remove it entirely), but that seems like a horrible approach, as my database grows by ~10k records per day. Is there any other way to output all the rows of a MySQL table in JSON format to the client? I'm generating Google Maps markers on the client side from the rows obtained.

JJGong
  • 129
  • 2
  • 11
  • Possible duplicate of http://stackoverflow.com/questions/415801/allowed-memory-size-of-33554432-bytes-exhausted-tried-to-allocate-43148176-byte – Jay Blanchard Jul 27 '16 at 21:39
  • it's basically impossible to determine how much memory PHP will use just by looking at the code –  Jul 27 '16 at 21:41
  • 2
    Try reading [this blog post](https://nikic.github.io/2011/12/12/How-big-are-PHP-arrays-really-Hint-BIG.html) which explains exactly what memory is used building arrays in PHP – Mark Baker Jul 27 '16 at 21:42
  • you'll have TWO copies of all your data in memory at some point: the original data array, and then the json-encoded copy of it. Even if that json is just immediately spit out as output, the entire json string is in ram for some amount of time. And note that PHP has overhead for data: an integer in php doesn't occupy JUST 2 or 4 bytes, there's an entire zval struct holding it and its metadata in the background. – Marc B Jul 27 '16 at 21:46
  • @JayBlanchard, I saw that, but apparently the accepted answer is not recommended. I just want to understand why so much memory was used and whether it is supposed to be this way. – JJGong Jul 27 '16 at 22:05
  • @MarkBaker, I've implemented a sample malloc() in C before, but omg, that much overhead per allocation is crazy. – JJGong Jul 27 '16 at 22:09
  • 1
    That's for PHP 5; PHP 7 is rather lower..... but try to understand what is actually being stored (and why) rather than simply complaining about how excessive it is – Mark Baker Jul 27 '16 at 22:12
  • @MarkBaker, no no I'm not complaining, I'm just shocked because I've never seen an overhead that large before. I'm assuming one can't just keep extending the memory limit forever, so what is the standard way to handle data this large? I've looked at unbuffered mysqli queries, but given that I want a single whole result set returned, it doesn't seem to fit. The only other way I can think of is to restructure the client side to accept unbuffered chunks – JJGong Jul 27 '16 at 22:18
  • 1
    No, you can't simply keep increasing the memory limit... rather than building a large array in memory and then json_encoding it, you might consider using a JSON stream writer like https://github.com/skolodyazhnyy/json-stream and streaming directly to php://output, as described [here](https://soyuka.me/streaming-big-json-files-the-good-way/) (see the sketch after these comments) – Mark Baker Jul 27 '16 at 22:23
  • @MarkBaker, thanks I'll look into that. Meanwhile, do I need 128 MB of available RAM on my server in order to store the PHP array? I have a small server with only 512 MB of RAM. Excluding dependencies and stuff, I would only have ~200-300 MB of RAM left. Increasing the memory limit would also require you to HAVE enough memory in the first place, correct? – JJGong Jul 27 '16 at 22:26
  • Each execution of a PHP script takes as much memory as it requires (up to the defined limit) and then releases that memory when the thread terminates, so that usage is transient as each thread will typically only run for a few seconds – Mark Baker Jul 27 '16 at 23:15
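
A minimal sketch of the approach Mark Baker describes above (streaming straight to the output rather than building an array first). It uses only mysqli and json_encode, not the json-stream library he links, and an unbuffered query so the full result set is never held in PHP memory; the connection and table names are taken from the question:

require_once('connection.php');
$sql = "SELECT * FROM BlahBlah";
// MYSQLI_USE_RESULT = unbuffered: rows are fetched from MySQL one at a time
$result = $con->query($sql, MYSQLI_USE_RESULT);

header('Content-Type: application/json');
echo '[';
$separator = '';
while($row = $result->fetch_assoc()) {
  echo $separator . json_encode($row); // each row goes straight to the output
  $separator = ',';
}
echo ']';
$result->free();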

1 Answer

1

I had the same problem as you. I solved it by building the JSON string manually and sending it to the user in parts, as soon as possible (in batches, of course).

require_once('connection.php');
$sql = "SELECT * FROM BlahBlah";
$result = $con->query($sql);


$json_str = '['; // we will send this to the user
$count = 0; // current number of fetched rows
$max_count = 100; // max number of rows we keep in memory before sending to the client

$quote = '';
// $quote is the separator placed between the per-row JSON strings;
// it's empty only on the first iteration

while($row = $result->fetch_assoc()) {
  $json_str .= $quote . json_encode( $row );
  $quote = ',';
  $count++;
  if( $count >= $max_count ){
     echo $json_str;  // send all the data we have so far
     // if you don't want to echo it, you can write it to a file instead

     $json_str = '';  // clear the string so memory doesn't keep growing
     $count = 0;
  } 
}
echo $json_str .']';
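
Note: whether each echoed batch actually leaves the server right away depends on PHP output buffering and the web server. If an output buffer is active, you may also need to flush after each echo, something like:

  if( $count >= $max_count ){
     echo $json_str;
     if( ob_get_level() > 0 ) ob_flush(); // flush PHP's output buffer, if one is active
     flush();                             // ask the web server to send the data now
     $json_str = '';
     $count = 0;
  }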

UPDATE: There are also other ways to reduce memory usage:

  1. You can use SplFixedArray for $data to store all the rows (in my experience it was 20-30% more memory efficient than a regular PHP array).
  2. You can JSON-encode each row before adding it to $data. It sounds odd, but JSON strings can be more memory efficient than arrays.

But these methods are less memory efficient than streaming the output, and slower; a rough sketch combining both follows.
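
A rough sketch of both ideas together (this assumes you still want the whole result in memory before sending; the SplFixedArray is sized from num_rows of a buffered query, and each row is stored as an already-encoded JSON string):

require_once('connection.php');
$result = $con->query("SELECT * FROM BlahBlah");

$data = new SplFixedArray($result->num_rows); // fixed size: no over-allocation while growing
$i = 0;
while($row = $result->fetch_assoc()) {
  $data[$i++] = json_encode($row); // store the encoded string instead of the associative array
}
echo '[' . implode(',', $data->toArray()) . ']';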

Arnial
  • 1,433
  • 1
  • 11
  • 10