I have a requirement to generate a report as an XLSX file; the report may contain 10,000 to 1,000,000 rows of transactions. I decided to use PhpSpreadsheet from https://phpspreadsheet.readthedocs.io/en/latest/.
The problem is that writing just 10,000 rows, each record consisting of 50 columns, takes far too long: the script has been running for nearly 24 hours and progress is only at 2,300/10,000. Here is my code:
<?php
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\Spreadsheet;

// Use Redis as PhpSpreadsheet's cell cache to keep memory usage down
$client = new \Redis();
$client->connect('192.168.7.147', 6379);
$pool = new \Cache\Adapter\Redis\RedisCachePool($client);
$simpleCache = new \Cache\Bridge\SimpleCache\SimpleCacheBridge($pool);
\PhpOffice\PhpSpreadsheet\Settings::setCache($simpleCache);

$process_time = microtime(true);

// Create an empty workbook on the first run
if (!file_exists('test.xlsx')) {
    $spreadsheet = new Spreadsheet();
    $writer = new \PhpOffice\PhpSpreadsheet\Writer\Xlsx($spreadsheet);
    $writer->save('test.xlsx');
    unset($writer);
}

// Open the file, append one row, and save it again on every iteration
for ($r = 1; $r <= 10000; $r++) {
    $reader = new \PhpOffice\PhpSpreadsheet\Reader\Xlsx();
    $spreadsheet = $reader->load('test.xlsx');

    $rowArray = [];
    for ($c = 1; $c <= 50; $c++) {
        $rowArray[] = $r . '.Content ' . $c;
    }

    // Write the 50 values into row $r starting at column A
    $spreadsheet->getActiveSheet()->fromArray($rowArray, null, 'A' . $r);

    $writer = new \PhpOffice\PhpSpreadsheet\Writer\Xlsx($spreadsheet);
    $writer->save('test.xlsx');

    unset($reader);
    unset($writer);
    $spreadsheet->disconnectWorksheets();
    unset($spreadsheet);
}

$process_time = microtime(true) - $process_time;
echo $process_time . "\n";
Notes:
- I proposed a CSV file, but the client only wants XLSX.
- Without the Redis cache I get a memory error even with fewer than 400 records.
- I don't intend to read the .XLSX with PHP, only write to it, but the library appears to read the entire spreadsheet on every load.
- In the example above the file is opened and closed for every single record; when I instead do open → write all → close (see the sketch below), I get a memory error mid-progress.
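For reference, the open → write all → close variant I mean looks roughly like this (a minimal sketch using the same Redis cache setup and placeholder data as above); this is the version that hits the memory error partway through the loop:

<?php
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\Spreadsheet;

// Same Redis cell cache as in the main example
$client = new \Redis();
$client->connect('192.168.7.147', 6379);
$pool = new \Cache\Adapter\Redis\RedisCachePool($client);
$simpleCache = new \Cache\Bridge\SimpleCache\SimpleCacheBridge($pool);
\PhpOffice\PhpSpreadsheet\Settings::setCache($simpleCache);

// Open once, write all rows, save once at the end
$spreadsheet = new Spreadsheet();
$sheet = $spreadsheet->getActiveSheet();

for ($r = 1; $r <= 10000; $r++) {
    $rowArray = [];
    for ($c = 1; $c <= 50; $c++) {
        $rowArray[] = $r . '.Content ' . $c;
    }
    $sheet->fromArray($rowArray, null, 'A' . $r);
    // the memory error shows up partway through this loop
}

$writer = new \PhpOffice\PhpSpreadsheet\Writer\Xlsx($spreadsheet);
$writer->save('test.xlsx');
$spreadsheet->disconnectWorksheets();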