I have a Laravel application with about 52 million records in a single serials table, created by the migration below.
Schema::create('serials', function (Blueprint $table) {
    $table->id();
    $table->bigInteger('pinNumber');
    $table->bigInteger('serialNumber');
    $table->boolean('checked')->default(0);
    $table->boolean('status')->default(0);
    $table->string('lotNumber')->nullable();
    $table->integer('checkCode');
    $table->index(['serialNumber', 'pinNumber']);
    $table->softDeletes();
    $table->timestamps();
});
I am planning to grow to about 100 million records, but the web application is already extremely slow with the current 52 million.
My insertions (auto-generated serials) are working fine, but computing the counts below takes much longer than expected.
$totalSerials = Serial::max('id');
$totalDownload = Lot::sum('count');
$approvedCodes = Serial::where('checked', true)->count();
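From the schema above, the only secondary index is the composite one on serialNumber and pinNumber, so I assume the checked count has to scan the whole table. Would adding a single-column index, roughly like the sketch below (inside a new migration's up() method), make a meaningful difference at this size?

Schema::table('serials', function (Blueprint $table) {
    // Plain index so counting WHERE checked = 1 can use an index scan
    // instead of reading every row
    $table->index('checked');
});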
Please advise on the best way to handle this volume of data with Laravel. The server has 8 GB of RAM and a 160 GB SSD.
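I am also considering caching these aggregates instead of recomputing them on every request, roughly like the sketch below (the cache keys and the 10-minute TTL are just placeholders), but I am not sure whether that only hides the underlying problem:

use Illuminate\Support\Facades\Cache;

// Cache each aggregate for 600 seconds; keys and TTL are illustrative
$totalSerials  = Cache::remember('serials.total', 600, fn () => Serial::max('id'));
$totalDownload = Cache::remember('lots.download_total', 600, fn () => Lot::sum('count'));
$approvedCodes = Cache::remember('serials.approved', 600, fn () => Serial::where('checked', true)->count());

Is caching like this a reasonable approach, or should I be restructuring the tables or queries themselves?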