I have the following code, which I would like to make more efficient at performing inserts into the database, and I would also like to update it to use PDO:
foreach ($urls as $i => $url) {
    // path to the remote csv file
    $web = "http://path-to-file/" . $url;

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_COOKIEJAR, "cookies.txt");
    curl_setopt($ch, CURLOPT_COOKIEFILE, "cookies.txt");
    curl_setopt($ch, CURLOPT_URL, $web);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $result = curl_exec($ch);
    curl_close($ch);

    // split the csv file into rows by new line
    $rows = explode("\n", $result);

    $sql = "insert into product (`product_name`) values ";
    // starts at 1, so the first line (the CSV header) is skipped
    for ($j = 1; $j < count($rows); $j++) {
        // split each row by comma
        $n = explode(",", $rows[$j]);
        $product_name = $n[0];
        $sql .= "('$product_name'),";
    }
    $sql = rtrim($sql, ",");

    // only run the query if at least one data row was found
    if (count($rows) > 1) {
        mysql_query($sql);
    }
}
Firstly, there is a loop in which data is retrieved from a remote file and all of it is inserted into the database. At the moment the code inserts the data from one file and then moves on to the next file. Each file can have up to 2,000 inserts, and files are usually processed in batches of 15 (so roughly 30,000 records are inserted in total). Would it be more efficient on the database to insert all 30,000 records in one hit, or in batches? (Bearing in mind there may be multiple users on the site also retrieving files and performing inserts at the same time.)
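For reference, this is roughly how I imagine chunking the rows if batching turns out to be the better option (the chunk size of 500 is just a guess on my part, and this sketch still uses the old mysql_* functions):

// split off the header line, then insert in chunks of 500 rows
$rows = array_slice(explode("\n", $result), 1);

foreach (array_chunk($rows, 500) as $chunk) {
    $values = array();
    foreach ($chunk as $row) {
        $n = explode(",", $row);
        if ($n[0] === "") {
            continue; // skip blank lines
        }
        $values[] = "('" . mysql_real_escape_string($n[0]) . "')";
    }
    if (count($values) > 0) {
        // one multi-row insert per chunk
        mysql_query("insert into product (`product_name`) values " . implode(",", $values));
    }
}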
Secondly, I would like to modify the original code above to use PDO.
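This is the direction I was thinking of for the PDO part (it assumes an existing PDO connection in $pdo, which I already know how to create, and that $result holds the downloaded CSV as in the code above), but I'm not sure it is the right approach:

// assumes $pdo is an existing PDO connection and $result is the downloaded CSV
$stmt = $pdo->prepare("INSERT INTO product (`product_name`) VALUES (:product_name)");

$pdo->beginTransaction();
foreach (array_slice(explode("\n", $result), 1) as $row) {
    $n = explode(",", $row);
    if ($n[0] === "") {
        continue; // skip blank lines
    }
    // bound parameter instead of building the SQL string by hand
    $stmt->execute(array(":product_name" => $n[0]));
}
$pdo->commit();

I went with a transaction per file since each file can be up to 2,000 rows, but I don't know if that is the best choice.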
Any help with this will be appreciated.
Thanks
PS. I do know how to create the PDO db connection.