I'm migrating an old PHP project to a new Laravel app. There is a table, user_category_views, with a few million records, which I was planning to migrate in chunks: I'm fetching the old records with mysqli and inserting them with Laravel's DB::statement(). For some reason, after about a million records this code dies with:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 73728 bytes)
What is overflowing the memory here? Maybe $result->free() doesn't work the way I think?
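
To narrow it down, I'd drop a probe like this at the end of each iteration of the loop below (memory_get_usage(), memory_get_peak_usage(), and gc_collect_cycles() are PHP built-ins; the chunk label is just for the log). If the numbers keep climbing even after $result->free(), then something on the PHP/Laravel side is retaining memory, not the mysqli result:

// At the end of each iteration of the migration loop:
gc_collect_cycles(); // collect reference cycles first so the reading reflects real retention
printf(
    "chunk %d: %.1f MB in use, %.1f MB peak\n",
    $i / $step,
    memory_get_usage(true) / 1048576,
    memory_get_peak_usage(true) / 1048576
);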
$count = 2000000; // the real number comes from a COUNT(*) query
$vendors = [561 => '618', 784 => '512' /* and so on */];
$step = 5000;

for ($i = 0; $i <= $count; $i += $step) {
    $q = "SELECT * FROM `user_category_views` LIMIT $i, $step;";
    if ($result = $this->mysqli->query($q)) {
        $stmt = "INSERT INTO vendor_views (`vendor_id`, `counter`, `created_at`) VALUES";
        $values = [];
        // Fetch each row as an associative array.
        while ($row = $result->fetch_assoc()) {
            // Skip rows whose user_category_id has no vendor mapping.
            // (My original try/catch around this lookup, a habit from js
            // coding, never fired: reading a missing array key raises a
            // notice in PHP, not a catchable Exception, so the null
            // coalescing operator is the idiomatic guard.)
            $vendor_id = $vendors[$row['user_category_id']] ?? null;
            if ($vendor_id === null) {
                continue;
            }
            $values[] = "({$vendor_id}, {$row['counter']}, '{$row['created']}')";
        }
        $result->free();
        // Skip chunks where every row was filtered out; an INSERT with an
        // empty VALUES list would be a syntax error.
        if ($values) {
            DB::statement($stmt . ' ' . implode(',', $values));
        }
        $stmt = null;
    }
}
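
One possibility I can't rule out myself: when Laravel's query log is enabled, every executed statement (SQL plus bindings) is kept in an in-memory array for the lifetime of the process, so the ~400 chunked INSERTs here would accumulate even though $stmt itself is released each iteration. I don't know whether that's the culprit, but ruling it out is a one-liner before the loop; disableQueryLog() is an existing method on Illuminate\Database\Connection:

use Illuminate\Support\Facades\DB;

// When the query log is on, each DB::statement() call is appended to an
// in-memory array of executed queries; turning it off before the
// migration loop keeps memory flat if the log was the problem.
DB::connection()->disableQueryLog();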