Currently, I'm getting a two-dimensional PHP array of data by downloading a CSV file from a URL and parsing it with str_getcsv().
After that, I loop through each line and insert the values into the corresponding SQL columns:
foreach ($array as $a => $value) {
    // defining variables
    $sql = "INSERT INTO Ranking (ra, na, sc, kd, wi, ki, de, sh, ti)
            VALUES ('$ra','$na','$sc','$kd','$wi','$ki','$de','$sh','$ti')";
    $conn->query($sql);
}
$conn->close();
I have about 16,000 lines of data but, weirdly, the insertion stops after exactly 10,999 rows. That's about 929 KB.
I can't find a reason, since my hosting provider states that there is no row limit. I don't think the PHP script stops due to an execution time limit, because after this insertion the array also gets stored locally, and that happens about 4 seconds after the file is downloaded.
Does the foreach() loop have some kind of time limit? Does it stop after a certain number of iterations? Would it work if I connected once and used just ONE SQL query?
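Regarding the single-query idea: one multi-row INSERT ... VALUES (...),(...),... sends all rows in a single round trip instead of 16,000 separate queries, which is usually far faster. A minimal sketch, assuming the column list from the question; buildBulkInsert is a hypothetical helper, and addslashes() is only a stand-in for real escaping (with a live connection, prefer $conn->real_escape_string() or prepared statements):

```php
<?php
// Build one multi-row INSERT from the parsed CSV array.
// Each $r is expected to hold the 9 values in column order.
function buildBulkInsert(array $rows): string {
    $tuples = [];
    foreach ($rows as $r) {
        // addslashes() is a placeholder; use real escaping or
        // prepared statements against a live connection.
        $escaped = array_map(fn($v) => "'" . addslashes((string)$v) . "'", $r);
        $tuples[] = '(' . implode(',', $escaped) . ')';
    }
    return "INSERT INTO Ranking (ra, na, sc, kd, wi, ki, de, sh, ti) VALUES "
         . implode(',', $tuples);
}

// Usage (hypothetical sample row):
$rows = [
    ['1', 'alice', '500', '1.2', '10', '60', '50', '75', '3600'],
];
$sql = buildBulkInsert($rows);
// $conn->query($sql);  // one round trip instead of one per row
```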
UPDATE:
Turns out the SQL import time was exceeded:
Fatal error: Maximum execution time of 10 seconds exceeded
in /home/mainhost/public_html/phpMyAdmin/libraries/import/sql.php on line 118
(output of a manual insertion attempt in phpMyAdmin)
What do I do? Split the input into two queries?
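Rather than exactly two queries, one option is chunking: array_chunk() splits the parsed rows into fixed-size batches, and each batch becomes one multi-row INSERT that finishes well under a 10-second limit. A sketch with placeholder data (the batch size of 1,000 is an assumption, not a tested optimum):

```php
<?php
// Hypothetical sketch: batch the parsed CSV rows so no single query
// runs long enough to hit the server's execution-time limit.
$rows = array_fill(0, 16000, ['0']); // placeholder for the real CSV data

$batches = array_chunk($rows, 1000); // 1,000 rows per INSERT (assumed size)
echo count($batches), "\n";          // 16

foreach ($batches as $batch) {
    // Build one multi-row INSERT string from $batch and run it:
    // $conn->query($sqlForBatch);
}
```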