
I have a MySQL table with the columns id, source, title, description, link, and I have a big PHP array with more than 1000 elements containing information which I need to insert/update in the database. I don't want to use PHP's json_encode/json_decode functions and save the info in JSON format.

I tried to insert/update rows using a while/for loop, but it seems inefficient and not good practice at all:

ERROR: Fatal error: Maximum execution time of 120 seconds exceeded in

$query = "INSERT INTO table (id,source,title,description,link) VALUES (?,?,?,?,?)";

$i = 0;
$count = count($arr["id"]); // count rows, not columns

while ($i < $count) {

    $stmt = $cn->prepare($query);
    $stmt->bind_param("issss",
        $arr["id"][$i],
        $arr["source"][$i],
        $arr["title"][$i],
        $arr["description"][$i],
        $arr["link"][$i]
    );
    $stmt->execute();

    ++$i;
}

Question: Is it possible to insert/update multiple rows at once? And if yes, how? Thanks.
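(One cheap improvement to the loop above, independent of multi-row inserts, is to prepare the statement once and run all executes inside a single transaction, so the 1000+ rows cost one commit instead of one per row. A sketch, assuming `$cn` is an open mysqli connection and `$arr` has the column-indexed shape shown above:)

```php
<?php
// Sketch: same row-by-row insert, but the statement is prepared ONCE
// and all executes happen in one transaction (assumes $cn is an open
// mysqli connection and $arr["id"][$i] etc. as in the question).

$query = "INSERT INTO `table` (id,source,title,description,link) VALUES (?,?,?,?,?)";
$stmt  = $cn->prepare($query);
// bind_param binds by reference, so re-assigning these variables
// before each execute() is enough:
$stmt->bind_param("issss", $id, $source, $title, $description, $link);

$cn->begin_transaction();           // one commit instead of 1000+
$count = count($arr["id"]);
for ($i = 0; $i < $count; $i++) {
    $id          = $arr["id"][$i];
    $source      = $arr["source"][$i];
    $title       = $arr["title"][$i];
    $description = $arr["description"][$i];
    $link        = $arr["link"][$i];
    $stmt->execute();               // re-uses the bound variables
}
$cn->commit();
$stmt->close();
```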

redevil
  • https://www.mysqltutorial.org/mysql-insert-multiple-rows/ – georgecj11 May 17 '22 at 12:49
  • @Alex No, I do not want to use loops, whether foreach, for, or while – redevil May 17 '22 at 12:52
  • You could write all `values` as CSV, then bind them all. With this approach `$stmt = $cn->prepare($query);` should be outside the loop – user3783243 May 17 '22 at 12:55
  • Something like `$query = "INSERT INTO table (id,source,title,description,link) VALUES "; $query .= implode(',', array_fill(0, count($arr), '(?,?,?,?,?)'));` for the former approach; you will then need to map the bindings, which might be hard with mysqli – user3783243 May 17 '22 at 13:01
  • I would find out the maximum number of queries the system is able to "eat" at a time, slice the array into pieces, and execute these pieces one by one. – Eugene Kapustin May 17 '22 at 16:08
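(Combining the two comment suggestions above — one multi-row `VALUES` list per statement, sliced into chunks — a hedged sketch. It assumes `$cn` is an open mysqli connection, `$arr` has the column-indexed shape from the question, and `id` is the table's primary or unique key so `ON DUPLICATE KEY UPDATE` covers the "update" half of insert/update:)

```php
<?php
// Sketch: multi-row INSERT ... ON DUPLICATE KEY UPDATE, in chunks.
// Assumes $cn is an open mysqli connection; chunk size is a guess
// to tune against max_allowed_packet and execution time.

$cols  = ['id', 'source', 'title', 'description', 'link'];
$chunk = 100;                              // rows per statement
$total = count($arr['id']);

for ($start = 0; $start < $total; $start += $chunk) {
    $rows = min($chunk, $total - $start);

    // One "(?,?,?,?,?)" group per row: VALUES (...),(...),...
    $placeholders = implode(',', array_fill(0, $rows, '(?,?,?,?,?)'));
    $sql = "INSERT INTO `table` (id,source,title,description,link)
            VALUES $placeholders
            ON DUPLICATE KEY UPDATE
                source = VALUES(source), title = VALUES(title),
                description = VALUES(description), link = VALUES(link)";

    // Flatten the bound values row by row, in column order.
    $types  = str_repeat('issss', $rows);
    $params = [];
    for ($i = $start; $i < $start + $rows; $i++) {
        foreach ($cols as $c) {
            $params[] = $arr[$c][$i];
        }
    }

    $stmt = $cn->prepare($sql);
    $stmt->bind_param($types, ...$params); // argument unpacking, PHP 5.6+
    $stmt->execute();
    $stmt->close();
}
```

This turns 1000+ round trips into about a dozen, which should comfortably beat the 120-second limit; `VALUES()` in the `UPDATE` clause is deprecated in MySQL 8.0.20+ in favor of row aliases, but still works.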

0 Answers