I have been having problems opening and reading the contents of a 2 GB CSV file. Every time I run the script it exhausts the server's memory (a 10 GB VPS cloud server) and gets killed. I have written a test script and was wondering if anyone could take a look and confirm that I am not doing anything silly (PHP-wise) that would cause what seems an unusually high amount of memory usage. I have spoken to my hosting company, but they are of the opinion that it is a code problem. So I'm just wondering if anyone can look over this and confirm there is nothing in the code that would cause this kind of problem.
Also, if you deal with 2 GB CSVs, have you encountered anything like this before?
Thanks
Tim
<?php
ini_set("memory_limit", "10240M");
$start = time();
echo date("Y-m-d H:i:s", $start)."\n";
$file = 'myfile.csv';
$lines = $keys = array();
$line_count = 0;
$csv = fopen($file, "r");
if ($csv !== false)
{
    echo "file open \n";
    // Length 0 = no line-length limit; ',' delimiter, '"' enclosure.
    while (($csv_line = fgetcsv($csv, 0, ',', '"')) !== false)
    {
        if ($line_count == 0) {
            // First row: build sanitised column keys from the header.
            foreach ($csv_line as $item) {
                $keys[] = preg_replace("/[^a-zA-Z0-9]/", "", $item);
            }
        } else {
            // Map each field onto its header key and keep the row.
            $array = array();
            for ($i = 0; $i < count($csv_line); $i++) {
                $array[$keys[$i]] = $csv_line[$i];
            }
            $lines[] = (object) $array;
            //print_r($array);
            //echo "<br/><br/>";
        }
        $line_count++;
    }
    if ($line_count == 0) {
        echo "invalid csv or wrong delimiter / enclosure ".$file;
    }
    // Only close the handle if fopen() actually succeeded.
    fclose($csv);
} else {
    echo "cannot open ".$file;
}
echo $line_count . " rows \n";
$end = time();
echo date("Y-m-d H:i:s", $end)."\n";
$time = number_format((($end - $start) / 60), 2);
echo $time." minutes \n";
echo "peak memory usage ".memory_get_peak_usage(true)."\n";