
I am trying to fetch a huge amount of data from a MySQL table to export it as an XLSX file.

I used the fetchAll() function, but I got

Fatal error: Out of memory

Here's my code:

<?php
require_once 'classes/Spout/Autoloader/autoload.php';
use Box\Spout\Writer\WriterFactory;
use Box\Spout\Common\Type;

$query  = "SELECT *
           FROM full_report";

$header = array('DATA','STORE','FROM','TO','DATE','YEAR','MONTH','ITEM','SIZE','DEPT','SUBDEPT','DESC1','DESC2','GENDER','ATTR','VEND','SEASON','INVO#','TRANS#','QTY','MSRP','RTP','COST','T.RTP','T.COST','PAYMENT','STATUS');

$mDb->query($query);
$result = $mDb->fetchAll(); // This is where I get the error!

$fileName = "fullReport-" . date('m-d-Y-H-i-s') . ".xlsx";
$path     = "_uploads/" . $fileName;

$writer = WriterFactory::create(Type::XLSX); // for XLSX files
$writer->openToFile($path); // write data to a file (use openToBrowser($fileName) instead to stream directly to the browser)
$writer->addRow($header);

$valuex = array();
foreach ($result as $value)
{
    unset($value['id']); // drop the 'id' column (both its associative and numeric keys)
    unset($value[0]);
    $valuex[] = array_values($value);
}
$writer->addRows($valuex);
$writer->close();

Any suggestions?

CairoCoder
  • A server with more memory? Or you could look at raising the memory limits of your application or web server; see for example http://stackoverflow.com/questions/13955914/php-out-of-memory-error-even-though-memory-limit-not-reached and check your PHP limits with http://techpp.com/2009/07/10/how-to-fix-php-fatal-error-out-of-memory/ – Joe T Oct 14 '16 at 02:56
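
For reference, raising the limit Joe T mentions looks like this (a stopgap that only postpones the error; '512M' is an arbitrary example value):

<?php
// Raise this script's memory ceiling so fetchAll() has more room to work with;
// this does not fix the unbounded fetch itself.
ini_set('memory_limit', '512M');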

2 Answers


fetchAll is the problem: it fetches all the matching rows from the table and loads everything into memory. That works when there aren't too many rows to fetch, but it causes Out of Memory errors once the rows to be stored exceed the available memory.

To fix this problem, you should fetch your data in multiple chunks. You can use the fetch method with a cursor instead; it is well documented in the PHP manual. You can also take a look at this repo: https://github.com/adrilo/spout-pdo-example. It shows how to use MySQL and Spout together in a scalable way, along the lines of the sketch below.
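
A minimal sketch of the chunked approach, assuming a PDO connection in $pdo and an indexed auto-increment `id` column on `full_report` (the connection variable and the batch size are illustrative):

<?php
require_once 'classes/Spout/Autoloader/autoload.php';
use Box\Spout\Writer\WriterFactory;
use Box\Spout\Common\Type;

$writer = WriterFactory::create(Type::XLSX);
$writer->openToFile('_uploads/fullReport.xlsx');
// $writer->addRow($header); // write your header row first

$lastId    = 0;
$batchSize = 1000; // rows per query; tune to your memory limit

do {
    // "last fetched id" pattern: the index on `id` lets MySQL skip straight
    // past already-exported rows, unlike OFFSET which rescans them every time
    $stmt = $pdo->prepare(
        'SELECT * FROM full_report WHERE id > :lastId ORDER BY id ASC LIMIT :limit'
    );
    $stmt->bindValue(':lastId', $lastId, PDO::PARAM_INT);
    $stmt->bindValue(':limit', $batchSize, PDO::PARAM_INT);
    $stmt->execute();

    $rowCount = 0;
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $lastId = $row['id'];
        unset($row['id']);
        $writer->addRow(array_values($row));
        $rowCount++;
    }
} while ($rowCount === $batchSize); // a short batch means we reached the end

$writer->close();

Only one batch of rows is held in memory at a time, so memory usage stays flat no matter how large the table is.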

Adrien
  • That worked, but sending that many queries brings the server down. I did that and the server went down. – CairoCoder Oct 17 '16 at 14:40
  • Your database should be able to easily handle the load if you send the correct requests. Something you need to know is that the offset/limit pattern does not work. Instead, you need to use the last_fetched_id/limit pattern so that your DB can skip the rows that were already fetched. If you look at the example in the link I gave, that's how it works. – Adrien Oct 17 '16 at 20:19
  • Yes, I know the example. I already built the whole solution and it worked perfectly, except with a huge amount of data the server goes down due to the number of queries. If you could advise on how to optimize the MySQL database to handle those queries, I would appreciate your help. – CairoCoder Oct 18 '16 at 02:16
  • My advice would be to make sure you have the right indexes in place. Otherwise you will have to do a full table scan, which will definitely take your site down. You can use the `EXPLAIN` keyword (assuming you're using MySQL) to understand how MySQL is fetching the data and if it uses indexes or not. If you are using indexes, then add a `sleep` in your code so that you don't overload your DB. – Adrien Oct 18 '16 at 16:48

I would suggest using SELECT * FROM your_table INTO OUTFILE '/folder/path/yourfilename.extension' and customizing the query to your requirements. Look into MySQL's INTO OUTFILE clause: for huge data sets it is the best solution, because MySQL generates the file itself. Make sure the target folder is writable (e.g. permission '777') and that no file with that name already exists, as MySQL will not overwrite it.
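
A minimal sketch of that approach, assuming a PDO connection in $pdo (note that INTO OUTFILE produces a delimited text file such as CSV, not XLSX):

<?php
// INTO OUTFILE is written by the MySQL *server* process, so it requires the
// FILE privilege, secure_file_priv must allow the target directory, and the
// file must not already exist (MySQL refuses to overwrite it).
$path = '/tmp/full_report-' . date('m-d-Y-H-i-s') . '.csv';

$pdo->exec(
    "SELECT * FROM full_report" .
    " INTO OUTFILE " . $pdo->quote($path) .
    " FIELDS TERMINATED BY ',' ENCLOSED BY '\"'" .
    " LINES TERMINATED BY '\\n'"
);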