I use XMLStreamer to read the XML file (structure here).
One file contains many products (about 100,000 per file). The products have parameters, up to 500,000 of them per file in total. There will be around 20 XML files (maybe more, each with different products). Is a database a good place to store this much data?
Main problem:
The problem is the execution time of the script that inserts the data into the database. With around 35,000 products it currently takes about 300 s. How can I optimize it?
Code:
// Make mysqli throw exceptions so the catch block below actually fires
// (this is the default behaviour only since PHP 8.1)
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);

try {
    $start = time();
    $mysqli->begin_transaction();
    $stmtProducts = $mysqli->prepare(
        "INSERT INTO products (shop_id, product_id, product_name) VALUES (?, ?, ?)"
    );
    $stmtProducts->bind_param("iis", $shopId, $productId, $productName);

    $streamer = new XMLStreamer('offer', 'products', 'product');
    foreach ($streamer->stream('bds.xml') as $product) {
        $document = new \DOMDocument();
        $document->appendChild($product);
        $element = simplexml_import_dom($product);

        // Cast explicitly: attributes() and element access return
        // SimpleXMLElement objects, not plain scalars
        $productId = (int) $element->attributes()->id;

        // No real_escape_string() here: the prepared statement already
        // handles escaping, and escaping twice corrupts the stored data
        $productName = (string) $element->description->name;

        $stmtProducts->execute();
    }

    $stmtProducts->close();
    $mysqli->commit();
    var_dump("Time [s]: " . (time() - $start));
} catch (mysqli_sql_exception $exception) {
    $mysqli->rollback();
    throw $exception;
}
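One common way to cut the runtime is to reduce server round trips by inserting many rows per INSERT statement instead of one. Below is a minimal sketch of that idea, assuming the same $mysqli connection, $shopId, $streamer, and products table as above; flushBatch() and the batch size of 1,000 are my own illustrative choices, not part of the original code. It needs PHP >= 5.6 for argument unpacking into bind_param().

```php
<?php
// Collect rows and send them to MySQL in multi-row INSERTs.
// 100,000 products then need ~100 statements instead of 100,000.
function flushBatch(mysqli $mysqli, array $rows): void
{
    if ($rows === []) {
        return;
    }
    // One placeholder triple per row: (?, ?, ?), (?, ?, ?), ...
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?, ?)'));
    $stmt = $mysqli->prepare(
        "INSERT INTO products (shop_id, product_id, product_name) VALUES $placeholders"
    );
    $types  = str_repeat('iis', count($rows));
    $params = array_merge(...$rows);   // flatten [[shop, id, name], ...]
    $stmt->bind_param($types, ...$params);
    $stmt->execute();
    $stmt->close();
}

$batch = [];
foreach ($streamer->stream('bds.xml') as $product) {
    $element = simplexml_import_dom($product);
    $batch[] = [
        $shopId,
        (int) $element->attributes()->id,
        (string) $element->description->name,
    ];
    if (count($batch) >= 1000) {       // tune the batch size to taste
        flushBatch($mysqli, $batch);
        $batch = [];
    }
}
flushBatch($mysqli, $batch);           // insert the remainder
```

Keep the batch small enough that the statement stays under MySQL's max_allowed_packet; if the files can be exported to CSV, LOAD DATA INFILE is usually faster still.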