
While this may seem like the right approach, I suspect my code is quite inefficient.

I'm running my Laravel 6.x app in a Docker container. When running the code below, I get

Allowed memory size of ** bytes exhausted (tried to allocate 8192 bytes)

No matter how high I set the memory_limit, it's the same error (with the new limit). So I want to review my code:

// I'm running a seeder.
$arr = [1,2,3,4,5,.....];

// Get all users and update a column:
$users = User::all();

// Loop and update (we have thousands of users)
foreach ($users as $user) {
  $index = array_rand($arr);
  $user->someColumn = $arr[$index];
  $user->save();
}
// Another loop for another model with same as above.....

This is causing the "allowed memory" issue. Is there any better way to achieve this?

Sylar
  • can you do this in database layer instead? of course you'll exhaust your memory if you iterate each and every user – Kevin Jul 17 '20 at 08:17
  • What was the `memory_limit` you set? Does your system have enough RAM? – sykez Jul 17 '20 at 08:18
  • Hi. What do you mean? Raw SQL? If so, how would I include the random item from the array? – Sylar Jul 17 '20 at 08:19
  • @Sylar do you really need the seed in PHP layer?, if all can be done in db layer, the better. – Kevin Jul 17 '20 at 08:27
  • @Kevin Could you show me an example how to include the array, select a random item from the array, should I do it in raw sql? linktoahref's method seems to be going just fine. SQL way seems more advance. – Sylar Jul 17 '20 at 08:37
  • if your seeder can generate one by one, consider batch update user, may be 100 rows each time – Gavin Kwok Jul 17 '20 at 08:43
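For the database-layer approach Kevin suggests, one possibility is to do the whole update in a single SQL statement so no models are loaded into PHP at all. This is only a sketch: it assumes MySQL, that `$arr` holds a small fixed set of values, and the table/column names (`users`, `some_column`) are illustrative.

```php
use Illuminate\Support\Facades\DB;

// ELT(n, v1, v2, ...) returns the n-th listed value, and
// FLOOR(1 + RAND() * 5) yields a random integer from 1 to 5,
// so every row gets one random value from the list in one query.
DB::statement(
    "UPDATE users SET some_column = ELT(FLOOR(1 + RAND() * 5), 1, 2, 3, 4, 5)"
);
```

Because the randomness happens per row inside MySQL, this avoids both the memory problem and the thousands of individual `save()` queries.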

1 Answer


Instead of fetching all the users at once, fetch them in chunks:

// Load 10 users at a time; each chunk is released before the next
// is fetched, so memory usage stays bounded.
\App\User::chunk(10, function ($users) use ($arr) {
    foreach ($users as $user) {
        $index = array_rand($arr);
        $user->someColumn = $arr[$index];
        $user->save();
    }
});
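Another memory-friendly option (a sketch, not part of the original answer) is Eloquent's `cursor()`, available in Laravel 6, which iterates the result set with a generator and hydrates only one model at a time:

```php
// cursor() runs a single query but yields models one by one,
// so only the current User instance is held in memory.
foreach (\App\User::cursor() as $user) {
    $user->someColumn = $arr[array_rand($arr)];
    $user->save();
}
```

Note that each `save()` is still a separate UPDATE query either way; wrapping batches of saves in a transaction can noticeably speed up a seeder like this.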
linktoahref