
I have code here that generates an Excel file when the submit button is clicked. The file is populated with data from MySQL and is, of course, based on a MySQL query. Here it is:

<?php
require("aacfs.php");

header("Content-type: application/ms-excel");
header("Content-Disposition: attachment; filename=Reservation Summary_sortbydate.xls");
header("Pragma: no-cache");
header("Expires: 0");

$head1="Ayala Aviation Corporation";
$head2="RESERVATION SUMMARY";
$head3="For the period ___________";


$heads="$head1\n$head2\n$head3\n";

$query = "select bdate as 'Date', cliename as 'Client', grpcode as 'Group Code', bperson as 'Contact', reservno as 'Reservation No.', acode as 'Aircraft', fdate as 'Flight Date', itinerary as 'Itinerary', etd as 'ETD', eta as 'ETA', pname as 'Passengers', status as 'Status', cutoff as 'Confirmation Cut-off', adminid as 'Reserved by' from reservation order by bdate";

$result = mysql_query($query);
if($result) {
    $count = mysql_num_rows($result);
    for ($i = 0; $i < $count; $i++) {
        for ($i = 0; $i < mysql_num_fields($result); $i++) {
            $schema_insert_rows .= mysql_field_name($result, $i) . "\t";
        }
        $schema_insert_rows .= "\n";

        while($row = mysql_fetch_row($result)) {
            $line = '';
            foreach($row as $value) {
                if((!isset($value)) OR ($value == "")) {
                    $value = "\t";
                } else {
                    $value = str_replace('"', '""', $value);
                    $value = '"'.$value.'"'."\t";
                }
                $line .= $value;
            }
            $data .= trim($line)."\n";
        }
        $data = str_replace("\r", "", $data);
        if($data == "") {
            $data = "\n(0) Records Found!\n";
        }
    }
    print mb_convert_encoding("$heads\n$schema_insert_rows\n$data", 'UTF-16LE', 'UTF-8');
} else die(mysql_error());
?>

It takes a long time to execute, and when the file is finally downloaded to my computer, it displays an error message: 'Maximum execution time of 60 seconds exceeded'. I tried setting the time limit with set_time_limit(0), but it still takes a long time before I can download the file, and when I open the downloaded file, another error appears in the file itself: 'Allowed memory size of 134217728 bytes exhausted (tried to allocate 133693422 bytes)'.

I have this exact code (only with a different mysql_query) running in another PHP file, so I'm really frustrated as to why this one doesn't work. It was working smoothly the other day, but when I checked it again today, it shows this error. What am I doing wrong? Any help would be appreciated. Thank you in advance! GOD bless!

xjshiya
  • Use fputcsv() rather than creating your own tab-separated strings. The built-in function is faster than your home-made method. – Mark Baker Aug 17 '12 at 06:27
  • Write data a row at a time to a temp file rather than building it all up in memory, then just stream that file to your php://output and delete it afterwards – Mark Baker Aug 17 '12 at 06:28
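For reference, here is a minimal sketch of the streaming approach Mark Baker suggests in the comments above, using the same mysql_* API as the question. It writes one row at a time with fputcsv() to php://output instead of building the whole export in one string. Note that it produces a plain CSV file (which Excel will still open) rather than the tab-separated .xls the original code emits, and the filename is just a placeholder.

<?php
require("aacfs.php");

// Stream a CSV straight to the browser instead of accumulating it in memory.
header("Content-Type: text/csv");
header("Content-Disposition: attachment; filename=reservation_summary.csv");

$out = fopen('php://output', 'w');

$result = mysql_query("select bdate, cliename, grpcode, bperson, reservno, acode, fdate, itinerary, etd, eta, pname, status, cutoff, adminid from reservation order by bdate");
if (!$result) {
    die(mysql_error());
}

// Header row first, then one fputcsv() call per data row - nothing is kept around.
fputcsv($out, array('Date', 'Client', 'Group Code', 'Contact', 'Reservation No.', 'Aircraft', 'Flight Date', 'Itinerary', 'ETD', 'ETA', 'Passengers', 'Status', 'Confirmation Cut-off', 'Reserved by'));
while ($row = mysql_fetch_row($result)) {
    fputcsv($out, $row);
}
fclose($out);
?>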

1 Answer


Did you try fetching only 10 rows to see if you get the same results or error? I think the data you are trying to export is quite heavy, which is why you hit the execution timeout.

Update your query with this.

$query = "
select 
    bdate as 'Date', 
    cliename as 'Client', 
    grpcode as 'Group Code', 
    bperson as 'Contact', 
    reservno as 'Reservation No.', 
    acode as 'Aircraft', 
    fdate as 'Flight Date', 
    itinerary as 'Itinerary', 
    etd as 'ETD', 
    eta as 'ETA', 
    pname as 'Passengers', 
    status as 'Status', 
    cutoff as 'Confirmation Cut-off', 
    adminid as 'Reserved by' 
from 
    reservation 
group by 
    bdate 
LIMIT 10";

Update your php.ini:

    post_max_size = 8M
    default_socket_timeout = 60
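If you can't edit php.ini on the server, you can usually raise the relevant limits per script at the top of the export file instead; a minimal sketch, with purely illustrative values:

<?php
// Per-request overrides; the values here are only examples, not tuned recommendations.
set_time_limit(0);               // lift the 60-second execution limit for this script
ini_set('memory_limit', '256M'); // raise the 128M memory ceiling from the error message

Keep in mind that raising limits only buys headroom: if the script keeps appending every row to one big string, a large enough table will still exhaust memory, which is where the row-by-row streaming suggested in the question comments helps.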
Philip F
  • I tried changing that but it still doesn't work. :(( The data I'm trying to retrieve as of now contains 20 rows only plus the columns I'm calling on my query. – xjshiya Aug 17 '12 at 05:58
  • I tried limiting the data to only 10 as you said and yes, it did work properly. But I really need to fetch those data. What should I do? – xjshiya Aug 17 '12 at 06:02
  • Go to your mysql.ini and change the configuration: Try this: [myisamchk] key_buffer_size = 20M sort_buffer_size = 20M read_buffer = 2M write_buffer = 2M – Philip F Aug 17 '12 at 07:15
  • I changed the configuration like you instructed and removed the limit I previously put in my code, but it changed nothing. It took a long time for the file to download, and the maximum execution error is still present when I open the file. :(( – xjshiya Aug 17 '12 at 09:11
  • Here's my php.ini: post_max_size = 128M, default_socket_timeout = 60. I modified it as you said and removed the limit on my mysql query (though I doubted it would fix the problem, since my php.ini was already set higher), and I was right: it still has the same problem. :((( – xjshiya Aug 23 '12 at 00:12