
I have a very large MySQL database. When I select about 1M records, I need to write them to a CSV file on disk. I wrote a PHP script for this, but it gets killed by Linux. How can I make the PHP script fast enough (and keep it from being killed)?

    $batches = $itemcount / 50000;
    for ($i = 0; $i <= $batches; $i++) {
        $offset = $i * 50000;
        $sql = $sql_org . " LIMIT $offset, 50000";

        $stmt = $db->prepare($sql);
        if ($stmt) {
            if ($stmt->execute()) {
                $stmt->bind_result($FIRSTNAME, $LASTNAME, $PHONE, ...
  • Are you sure it's not https://www.php.net/manual/en/info.configuration.php#ini.max-execution-time – AbraCadaver May 14 '20 at 20:01
  • Yes, it's not execution time. I run the PHP script in a terminal. – May 14 '20 at 20:02
  • There are 1M records, so it runs out of memory. – May 14 '20 at 20:03
  • Maybe do it in chunks of 10,000 or whatever works and append to the CSV each time (see the streaming sketch after these comments). – AbraCadaver May 14 '20 at 20:05
  • Did you try fetching the data in small chunks, for example 1k records per query? – Mihail0v May 14 '20 at 20:05
  • Why don't you do it directly from MySQL with OUTFILE and skip PHP? In the terminal it doesn't really matter how big it is. – nbk May 14 '20 at 20:06
  • I already use small chunks of 50,000 lines, but making the CSV file on the AWS server still takes 2 hours. – May 14 '20 at 20:06
    Use something like `SELECT blah blah FROM foo WHERE what INTO OUTFILE 'file.csv' FIELDS ENCLOSED BY '"' TERMINATED BY ';' ESCAPED BY '"' LINES TERMINATED BY '\r\n';` You don't need PHP. – AbraCadaver May 14 '20 at 20:07
  • Make it smaller. Are you using an ORM or something like that? Maybe there's a memory leak in your code; it would help if you provided the full snippet. – Mihail0v May 14 '20 at 20:08
  • $batches = $itemcount / 50000; for ($i = 0; $i <= $batches; $i++) { $offset = $i * 50000; $sql = $sql_org . " LIMIT $offset, 50000 "; $stmt = $db->prepare($sql); if($stmt) { if($stmt->execute()) { $stmt->store_result(); if($stmt->num_rows == 0) { fclose($output); return 0; } –  May 14 '20 at 20:10
  • @HeZhiYong try using $stmt->free_result() to free the already-stored data. – Mihail0v May 14 '20 at 20:12
  • Linux doesn't kill programs randomly; there's a reason. Is it the OOM killer (out-of-memory killer)? If so, you can see that in the last lines of `dmesg`. If you're not sure, try increasing your memory (adding physical RAM or increasing swap) and see if the program still gets killed. Also try to figure out what signal it got. – Eric May 14 '20 at 22:59
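
Putting the comments together: instead of paging with LIMIT/OFFSET, mysqli can stream an unbuffered result set row by row, so PHP never holds the whole result in memory. A minimal sketch, assuming the mysqli connection $db and base query $sql_org from the question; the output path and the use of fetch_row() instead of the question's bind_result() pattern are assumptions:

    // Stream rows straight to disk instead of buffering the result set.
    $output = fopen('/path/to/export.csv', 'w');  // hypothetical path

    // MYSQLI_USE_RESULT keeps the result unbuffered: rows are pulled from
    // the server one at a time instead of being loaded into PHP memory.
    $result = $db->query($sql_org, MYSQLI_USE_RESULT);
    if ($result === false) {
        die($db->error);
    }

    while ($row = $result->fetch_row()) {
        fputcsv($output, $row);  // quotes and escapes each field for CSV
    }

    // The connection can't run other queries until the result is freed.
    $result->free();
    fclose($output);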

1 Answer


Install the mysql client and pipe the output out:

    // --batch prints tab-separated rows; sed turns them into quoted CSV fields.
    shell_exec('mysql -u username --password="password" --database=dbname'
        . ' --host=AWShostname --port=AWSport --batch'
        . ' -e "SELECT * FROM \`table\`"'
        . ' | sed \'s/\t/","/g;s/^/"/;s/$/"/\''
        . ' > /path/to/yourlocalfilename.csv');

Note: exporting a huge amount of data out of AWS can be expensive.
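
If the credentials live in PHP variables rather than being hard-coded, a hedged variant of the same pipeline (the variable names here are hypothetical) can be built with escapeshellarg() so that passwords containing shell metacharacters don't break the command:

    // Same mysql | sed pipeline, with every user-supplied piece quoted for
    // the shell via escapeshellarg(). Variable names are hypothetical.
    $cmd = sprintf(
        'mysql -u %s --password=%s --database=%s --host=%s --batch -e %s | sed %s > %s',
        escapeshellarg($user),
        escapeshellarg($pass),
        escapeshellarg($dbname),
        escapeshellarg($host),
        escapeshellarg('SELECT * FROM `table`'),
        escapeshellarg('s/\t/","/g;s/^/"/;s/$/"/'),
        escapeshellarg('/path/to/yourlocalfilename.csv')
    );
    shell_exec($cmd);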

ExploitFate