
I'm migrating an old PHP project to a new Laravel app. There is a table, user_category_views, with a few million records, which I was planning to migrate in chunks. I'm fetching the old records with mysqli and inserting them with Laravel's DB::statement. For some reason, after about a million records this code fails with an exception:

Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 73728 bytes)

What is exhausting the memory here? Maybe $result->free() doesn't work the way I think it does?

        $count = 2000000; // actual number will be received from count(*) stmt
        $vendors = [561 => '618', 784 => '512' /* and so on */ ];
        $step = 5000;
        for( $i=0; $i<=$count; $i+=$step ){

            $q = "SELECT * FROM `user_category_views` LIMIT $i, $step;";

            if ($result = $this->mysqli->query($q)) {

                $stmt = "INSERT INTO vendor_views (`vendor_id`, `counter`, `created_at`) VALUES";

                /* fetch associative array */
                while ($row = $result->fetch_assoc()) {

                    $vendor_id = null;

                    $id = $row['user_category_id'];

                    // Here I'm trying to prevent Laravel
                    // from throwing an exception if the entry
                    // is not found in the $vendors array.
                    // This is a habit I've gained from JS coding.
                    try{
                       $vendor_id = $vendors[$id];
                    } catch (Exception $e) {
                       continue;
                    }

                    if(empty($vendor_id)) continue;

                    $counter = $row['counter'];
                    $created = $row['created'];

                    $stmt .= " ($vendor_id, $counter, '{$created}'),";

                }

                $result->free();

                DB::statement( trim($stmt, ",") );

                $stmt = null;

            }
        }
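As an aside, the per-row string concatenation above can also be written as a parameterized bulk insert, which sidesteps quoting problems in the interpolated values. A minimal sketch (the helper name `buildVendorInsert` is hypothetical, not from the original code):

```php
<?php

// Build one parameterized multi-row INSERT from a chunk of source rows.
// Rows whose user_category_id has no vendor mapping are skipped, mirroring
// the `continue` in the original loop.
function buildVendorInsert(array $rows, array $vendors): ?array
{
    $placeholders = [];
    $bindings = [];

    foreach ($rows as $row) {
        $vendorId = $vendors[$row['user_category_id']] ?? null;
        if ($vendorId === null) {
            continue;
        }
        $placeholders[] = '(?, ?, ?)';
        $bindings[] = $vendorId;
        $bindings[] = $row['counter'];
        $bindings[] = $row['created'];
    }

    if ($placeholders === []) {
        return null; // nothing to insert in this chunk
    }

    $sql = 'INSERT INTO vendor_views (`vendor_id`, `counter`, `created_at`) VALUES '
        . implode(', ', $placeholders);

    return [$sql, $bindings];
}
```

The resulting pair could then be passed to `DB::insert($sql, $bindings)` instead of concatenating raw values into the statement.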
Shirker
  • What are you trying to do with the code `$vendor_id = $vendors[$id];` in a `try...catch...`? – Nigel Ren Nov 05 '19 at 06:50
  • Laravel will throw an exception if the record is not found in the $vendors array – Shirker Nov 05 '19 at 06:52
  • Are you using Laravel 6.x? – IGP Nov 05 '19 at 06:55
  • 1
    If using PHP 7+, use something like `$vendor_id = $vendors[$id]??null;` or check the value exists first. – Nigel Ren Nov 05 '19 at 06:57
  • @IGP yep Laravel 6.x – Shirker Nov 05 '19 at 07:15
  • @NigelRen I agree, it looks nicer. Anyway, the main problem is the memory overflow. I'm sure try/catch is not the cause – Shirker Nov 05 '19 at 07:21
  • It was just something that looked odd. As for your problem - perhaps looking at how it's done [on this question](https://stackoverflow.com/questions/12702812/bulk-insertion-in-laravel-using-eloquent-orm) may help. – Nigel Ren Nov 05 '19 at 07:22
  • @NigelRen hey man, you were right about this try/catch thing. My bad. After I replaced it with `$vendor_id = $vendors[$id]??null;` there's no more memory leak. I can increase the step to 20k+ with no more errors – Shirker Nov 05 '19 at 07:31

2 Answers


Thanks to Nigel Ren, who pointed out that I was misusing the try/catch construct.

So instead of

    // ...
    try{
        $vendor_id = $vendors[$id];
    } catch (Exception $e) {
        continue;
    }
    // ...

I'm using now

    $vendor_id = $vendors[$id] ?? null;

And no more memory leak problem. I don't know why, but now it works.
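For anyone comparing the options Nigel Ren mentioned: as far as I can tell, a bare `$vendors[$id]` on a missing key raises a PHP notice (not an exception), and it is Laravel's error handler that promotes such notices to an ErrorException, which is why the try/catch "worked" at all. A quick sketch of the three lookup styles in plain PHP, no Laravel involved:

```php
<?php

$vendors = [561 => '618', 784 => null];

// 1. Null coalescing: no notice for missing keys, but a key that exists
//    with a null value is indistinguishable from a missing key.
assert(($vendors[561] ?? null) === '618');
assert(($vendors[999] ?? null) === null);
assert(($vendors[784] ?? null) === null); // key exists, value is null

// 2. isset(): same behavior as ?? for null values.
assert(isset($vendors[561]) === true);
assert(isset($vendors[784]) === false); // a null value reads as "not set"

// 3. array_key_exists(): distinguishes "missing" from "set to null".
assert(array_key_exists(784, $vendors) === true);
assert(array_key_exists(999, $vendors) === false);
```

For a lookup table like $vendors, where a missing key simply means "skip this row", the `??` form is the most direct.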

Shirker

Your problem is that your memory is limited. Try setting this at the top of your PHP file:

ini_set('memory_limit', '-1');

to lift the memory limit for PHP.

If the above solution doesn't work, maybe your RAM is running out; please check it. You can also reduce your step, e.g. from $step = 5000; to $step = 1000;. I think that's better.
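If you do raise the limit, it helps to watch usage per chunk so a leak shows up as steady growth rather than a sudden fatal error. A minimal sketch using PHP's built-in memory functions (the helper name `logChunkMemory` is hypothetical):

```php
<?php

// Raise the limit for this script only. Note that '-1' removes the ceiling
// entirely, which can hide a leak instead of fixing it, so a finite value
// is usually safer.
ini_set('memory_limit', '512M');

// Report memory for one chunk; call this after each DB::statement().
function logChunkMemory(int $chunkIndex): string
{
    return sprintf(
        'chunk %d: %.1f MB in use, %.1f MB peak',
        $chunkIndex,
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576
    );
}
```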

Hope it helps. Thanks.

Quân Hoàng
  • The idea of using chunks is to be able to work with a small amount of memory. I'm freeing the results after every 5k records – Shirker Nov 05 '19 at 07:24