
I tried for hours, but I can't find a solution. I have 60 million or more records (MySQL) that I want to export as a CSV file. Please tell me how to use a bucket system for this process. If it is possible to export separate files, that would be good for me.

<?php
// Fetch Record from Database
@ini_set('log_errors','On');
@ini_set('display_errors','On');
@ini_set('error_log','error.log'); // path to server-writable log file
@ini_set('max_execution_time', 0); //unlimited
@ini_set('memory_limit', '-1');
$output = "";
$connection= mysqli_connect('192.168.1.203','cnsdb','cnsdb','test_irmt');
$table = "tracking";
$sql = mysqli_query($connection,"select * from {$table} limit 0,2000");
$columns_total = mysqli_num_fields($sql);

// Get The Field Name
$query = "SHOW COLUMNS FROM {$table}";
$result_set = mysqli_query($connection,$query);
while ($result = mysqli_fetch_array($result_set)) {
    $heading = $result[0];
    $output .= trim($heading.',');
}

$output = substr($output,0,strlen($output)-1)."\r\n";

// Download the file

$filename = "output.csv";
header("Pragma: public", true);
header("Expires: 0"); // set expiration time
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Content-Type: text/csv"); // only the last Content-Type header takes effect, so send a single correct one
header('Content-Disposition: attachment; filename='.$filename);
header("Content-Transfer-Encoding: binary");
header('Connection: Keep-Alive');

// Get the records from the table and build the CSV body
while ($row = mysqli_fetch_array($sql)) {
    for ($i = 0; $i < $columns_total; $i++) {
        $output .= '"'.$row[$i].'",';
    }
    $output = substr($output, 0, strlen($output) - 1)."\r\n";
}
echo $output;
exit;

?>
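The "bucket system" the question asks about boils down to paging the table with LIMIT/OFFSET so each bucket can go to its own file. A minimal sketch of the offset arithmetic, with the mysqli usage shown only as hypothetical comments (the table name, connection, and row counts are assumptions, not part of the original code):

```php
<?php
// Compute (offset, count) pairs that cover $total rows in buckets of $size.
function chunk_ranges(int $total, int $size): array
{
    $ranges = [];
    for ($offset = 0; $offset < $total; $offset += $size) {
        $ranges[] = [$offset, min($size, $total - $offset)];
    }
    return $ranges;
}

// Hypothetical usage against the tracking table (connection details assumed):
// foreach (chunk_ranges(60000000, 1000000) as $k => [$offset, $count]) {
//     $res = mysqli_query($connection,
//         "SELECT * FROM tracking LIMIT {$offset}, {$count}");
//     // ... write this bucket to its own CSV file, e.g. output_{$k}.csv ...
// }

var_dump(chunk_ranges(2500, 1000)); // three buckets: 1000, 1000, 500 rows
```

On 60M+ rows, a `WHERE id BETWEEN ...` range on an indexed primary key will outperform large OFFSETs, since MySQL must still scan past every skipped row.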
Raj Mohan
    Possible duplicate of [Dump a mysql database to a plaintext (CSV) backup from the command line](https://stackoverflow.com/questions/467452/dump-a-mysql-database-to-a-plaintext-csv-backup-from-the-command-line) `SELECT * INTO OUTFILE` is your friend for heavy lifting. – ficuscr Sep 22 '17 at 18:19
  • 1
    using this approach, you're creating the entire csv file in memory before sending it off, and i guess you don't have that much memory available. send the csv off as it is being generated, instead of storing it all in ram before sending it – hanshenrik Sep 22 '17 at 18:19
  • Do your users really want 60 million records being dumped to their browser? At least (if you absolutely have to do it in PHP) use the built-in fputcsv() function – Mark Baker Sep 22 '17 at 18:23
  • @ficuscr - I tried, but I can export only 10 million records – Raj Mohan Sep 22 '17 at 18:24
  • @MarkBaker is it possible to split 1M records into each CSV? – Raj Mohan Sep 22 '17 at 18:26
  • Run it six times then with a limit offset. You'll be hard pressed to find a more performant approach. – ficuscr Sep 22 '17 at 18:27
  • @ficuscr yeah, but I need a single execution. Is it possible to split the CSV into files and zip all the files? – Raj Mohan Sep 22 '17 at 18:28
  • Not sure what is meant by single execution. Just add millions of RAMs. :) – ficuscr Sep 22 '17 at 18:30
  • A better option is to run it in the background, and email the resulting file to the user..... is this really needed "on demand"? What are they doing with this csv file? (Because it's too large to load into a spreadsheet program, it is logically only useful for a csv importer to some other database/system.) – Mark Baker Sep 22 '17 at 18:31
  • @ficuscr, I'll try. Is there any other way? Would this help me: http://php.net/manual/en/book.stream.php ? I have no idea about streams – Raj Mohan Sep 22 '17 at 18:32
  • @MarkBaker, maybe they are trying to move the data to a different server or database. – Raj Mohan Sep 22 '17 at 18:34
  • Just TEE or whatever. `cat`. You on Windows? – ficuscr Sep 22 '17 at 18:34
  • @ficuscr, no, I'm using Linux... – Raj Mohan Sep 22 '17 at 18:35
  • @RajMohan Perhaps they do.... you should know what your customers want to use the data for, rather than simply giving them a screen that will hog your servers and probably cripple their browser before it fails on them.... if they want this data to load into MS Excel, then MS Excel can't handle that volume of data; if they want to load it into an external system or database, then what's wrong with emailing the file? – Mark Baker Sep 22 '17 at 18:41
  • @MarkBaker, my client has some other logic for the exported data. Only selected users, for a certain period of time, need to access the files – Raj Mohan Sep 22 '17 at 18:47
  • You're still better off submitting this volume as a background task – Mark Baker Sep 22 '17 at 19:07
  • Hi, I have a new question: is it possible to export as an Excel file with 10M records? – Raj Mohan Oct 17 '17 at 12:35

1 Answer


Example: try using this

SELECT *
INTO OUTFILE 'file.csv' -- change the filename here
FIELDS
  TERMINATED BY ','
  OPTIONALLY ENCLOSED BY '"'
  ESCAPED BY '\\'
  LINES TERMINATED BY '\n'
FROM tracking -- your table name
WHERE id BETWEEN 0 AND 10000000;

Change the upper/lower bound values as required.
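Once the export exists as one big file (from `INTO OUTFILE` or the streamed PHP loop), the comments' request to "split the file CSV and zip all the files" in a single execution can be done in the same script. A hypothetical sketch using PHP's ZipArchive (the file names and the tiny 3-line demo input are made up, and the zip extension must be enabled):

```php
<?php
// Split a CSV into parts of $linesPerPart lines each; return the part names.
function split_csv(string $src, int $linesPerPart, string $prefix): array
{
    $in = fopen($src, 'r');
    $parts = [];
    $out = null;
    $line = 0;
    while (($row = fgets($in)) !== false) {
        if ($line % $linesPerPart === 0) {      // start a new part file
            $name = sprintf('%s_%02d.csv', $prefix, count($parts));
            $parts[] = $name;
            if ($out) fclose($out);
            $out = fopen($name, 'w');
        }
        fwrite($out, $row);
        $line++;
    }
    if ($out) fclose($out);
    fclose($in);
    return $parts;
}

// Demo: fake a small export, split it into 2-line parts, zip the parts.
file_put_contents('tracking.csv', "1,a\n2,b\n3,c\n");
$parts = split_csv('tracking.csv', 2, 'tracking_part');
$zip = new ZipArchive();
$zip->open('tracking_parts.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach ($parts as $p) {
    $zip->addFile($p);
}
$zip->close();
```

For a real 60M-row export, `$linesPerPart` would be something like 1000000, and (per the accepted advice in the comments) the whole thing is best run as a background task rather than on demand.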