I need to run a daily cron job that iterates over a 6 MB CSV file and inserts each of the ~10,000 entries into a MySQL table. The code I have written hangs and eventually times out.
$row = 0;
if (($handle = fopen($localCSV, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $dbdata = array(
            'SiteID'    => $siteID,
            'TimeStamp' => $data[0],
            'ProductID' => $data[1],
            'CoordX'    => $data[2],
            'CoordY'    => $data[3]
        );
        $row++;
        // prepare and execute one INSERT per CSV row
        $STH = $DBH->prepare("INSERT INTO temp_csv (SiteID,TimeStamp,ProductID,CoordX,CoordY) VALUES (:SiteID,:TimeStamp,:ProductID,:CoordX,:CoordY)");
        $STH->execute($dbdata);
    }
    fclose($handle);
    echo $row." rows inserted.";
}
It would have been ideal to use the mysql_* functions instead of PDO, so I could implode all the values into one single (albeit huge) query, but unfortunately I have to comply with some guidelines that require PDO to be used exclusively.
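For what it's worth, this is roughly the single-query idea I had in mind, adapted to PDO with generated placeholders (untested sketch; it reuses $localCSV, $siteID and $DBH from above, and with ~10,000 rows it might need to be split into chunks to stay under max_allowed_packet):

// Untested sketch: build one multi-row INSERT with PDO instead of mysql_*.
$placeholders = [];
$values = [];
if (($handle = fopen($localCSV, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $placeholders[] = "(?,?,?,?,?)";
        // same column order as the INSERT below
        array_push($values, $siteID, $data[0], $data[1], $data[2], $data[3]);
    }
    fclose($handle);
}
$sql = "INSERT INTO temp_csv (SiteID,TimeStamp,ProductID,CoordX,CoordY) VALUES "
     . implode(",", $placeholders);
$STH = $DBH->prepare($sql);
$STH->execute($values);

I'm not sure whether that would still be acceptable under the guidelines, or whether it would still time out.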
I searched SO and found very similar questions, but none of them solved my problem. Here is what I have tried:
1- Ran LOAD DATA INFILE and LOAD DATA LOCAL INFILE queries (a sketch of the LOCAL variant follows this list), but I kept getting "file not found" errors even though the file is definitely there with 777 permissions. The DB server and the shared hosting account are in different environments. I tried both relative paths and URLs to the CSV file, but the file could not be found in either case.
2- I split the CSV file into two files and ran the script on each, to find the threshold at which the script hangs, but in each case the entries ended up inserted twice into the table.
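For reference, the LOCAL INFILE variant I ran was along these lines (the DSN, credentials and path are placeholders; from what I've read, pdo_mysql also needs PDO::MYSQL_ATTR_LOCAL_INFILE enabled on the connection for LOCAL to work at all):

// Rough sketch of the LOAD DATA LOCAL INFILE attempt (path and DSN are placeholders).
$DBH = new PDO("mysql:host=localhost;dbname=mydb;charset=utf8", $dbUser, $dbPass, array(
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
));
$sql = "LOAD DATA LOCAL INFILE '/path/to/file.csv'
        INTO TABLE temp_csv
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\\n'
        (TimeStamp, ProductID, CoordX, CoordY)
        SET SiteID = " . (int) $siteID;
$DBH->exec($sql);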
I don't have access to php.ini since it's a shared hosting account (cloudsites), and my only access to MySQL is through phpMyAdmin.
What else can I try to accomplish this as efficiently as possible?
Any help is appreciated.