I need to import a very large number of records, around 5 million.

During the import I also have to create entries in several related tables at the same time.

I am already using bulk insert queries for new entries and processing the data in chunks.

What other ways are there to speed up the process?

Vivek Solanki

3 Answers
So to summarize for people who don't bother looking through all comments separately:

  • Use the DB facade Laravel offers instead of Eloquent models for improved performance.
  • Run the import process in batches rather than importing all data at once.
  • Disable query logging to reduce memory usage.
  • Run the import in a background job, not during a web request.
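As a rough sketch of the first two bullets (the `users` table and `$rows` array are hypothetical placeholders, not from the question):

```php
<?php

use App\User;
use Illuminate\Support\Facades\DB;

// Slow: one INSERT statement (and one Eloquent model instance) per record.
foreach ($rows as $row) {
    User::create($row);
}

// Faster: the DB facade with one multi-row INSERT per batch of 1,000.
// Each element of $rows is an associative array matching the table columns.
foreach (array_chunk($rows, 1000) as $batch) {
    DB::table('users')->insert($batch);
}
```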

Besides the points already made you could consider:

  • First run a job which reads your input xlsx/csv file in batches of 50 lines (or some other sensible number). Make sure you do not keep all lines in memory. Then for each batch of lines create a new job. So you'll have a 2-step import.
  • When creating separate jobs for each batch, you can run those jobs concurrently (= multiple queue workers at the same time).
  • If you have users waiting for the import to be finished, make sure you show a progress bar or at least an animated loader of some kind. Although this does not speed up the process, it will give them an indication work is being done.
  • You could also leverage database transactions for running multiple database queries at once (credit to Philippe Thiers for this one).
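The two-step approach described above could look roughly like this (job class names, the batch size, and the `users` columns are illustrative; both classes are shown in one file for brevity):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\DB;

// Step 1: read the CSV line by line -- never the whole file at once --
// and dispatch one ImportBatch job per 50 lines.
class SplitImportFile implements ShouldQueue
{
    use Dispatchable, Queueable;

    private $path;

    public function __construct(string $path)
    {
        $this->path = $path;
    }

    public function handle()
    {
        $handle = fopen($this->path, 'r');
        $batch = [];

        while (($row = fgetcsv($handle)) !== false) {
            $batch[] = $row;
            if (count($batch) === 50) {
                ImportBatch::dispatch($batch);
                $batch = [];
            }
        }

        if ($batch !== []) {
            ImportBatch::dispatch($batch); // remaining lines
        }

        fclose($handle);
    }
}

// Step 2: each batch becomes one bulk insert; several queue workers
// can pick up these jobs concurrently.
class ImportBatch implements ShouldQueue
{
    use Dispatchable, Queueable;

    private $rows;

    public function __construct(array $rows)
    {
        $this->rows = $rows;
    }

    public function handle()
    {
        $records = array_map(function ($row) {
            return ['name' => $row[0], 'email' => $row[1]];
        }, $this->rows);

        DB::table('users')->insert($records);
    }
}
```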
PtrTon
  • I have tried queues for my process, but the jobs do not run simultaneously; they execute one after another. – Vivek Solanki Jun 15 '19 at 12:22
  • You'll have to run multiple commandlines at once for this to do so. In a production environment [Supervisor is advised](https://laravel.com/docs/5.8/queues#supervisor-configuration). This is a tool which keeps certain processes (in your case the queue worker) running at all times. If a job fails, errors or runs out of memory and stops it will be rebooted by supervisor. The configuration for supervisor allows you to specify `numprocs`, which is the number of simultaneous processes. – PtrTon Jun 15 '19 at 13:49
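For reference, a minimal Supervisor program section along the lines of the Laravel docs (the paths and `numprocs` value are placeholders to adapt):

```ini
[program:laravel-worker]
command=php /var/www/app/artisan queue:work --sleep=3 --tries=3
process_name=%(program_name)s_%(process_num)02d
numprocs=8            ; eight queue workers processing jobs concurrently
autostart=true
autorestart=true      ; restart a worker if it crashes or runs out of memory
user=www-data
```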
  • 1
    Add to the list, the use of transaction, `DB::transaction(closure);` or `\DB::beginTransaction; try{ /* code */ \Db::commit; } catch(\Exception $e){ \Db::rollback; } ` – Philippe Thiers Jun 17 '19 at 21:43

(copied from laracasts) This will probably help too:

DB::connection()->disableQueryLog();

"By default, Laravel keeps a log in memory of all queries that have been run for the current request. However, in some cases, such as when inserting a large number of rows, this can cause the application to use excess memory."
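In context, the call goes once at the start of the import; a sketch (the `records` table and `$rows` array are placeholders):

```php
<?php

use Illuminate\Support\Facades\DB;

// Turn off the in-memory query log before running many inserts.
// (As a later comment notes, this mainly matters on Laravel <= 4.2,
// where the log is enabled by default.)
DB::connection()->disableQueryLog();

foreach (array_chunk($rows, 1000) as $batch) {
    DB::table('records')->insert($batch);
}

// Re-enable it afterwards if other code relies on the log.
DB::connection()->enableQueryLog();
```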

manu
  • This is for `Laravel <= 4.2`; on later versions it's disabled by default, so it won't have an impact on `Laravel 5.8` as mentioned in the question requirements. – Vikash Pathak Jun 17 '19 at 11:09

Use:

  • Processing data in chunks
  • Laravel queues
  • https://docs.laravel-excel.com/3.1/imports/

Example of user model binding:

namespace App\Imports;

use App\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\Importable;
use Maatwebsite\Excel\Concerns\ToModel;

class UsersImport implements ToModel
{
    use Importable;

    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }
}

In Controller

(new UsersImport)->import('users.xlsx', 'local', \Maatwebsite\Excel\Excel::XLSX);
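Since the question involves millions of rows, the same import can be combined with Laravel Excel 3.1's chunk-reading and queue support (a sketch based on the linked docs; the chunk size of 1,000 is arbitrary):

```php
<?php

namespace App\Imports;

use App\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

// Reading in chunks keeps memory usage flat; implementing ShouldQueue
// turns each chunk into its own queued job.
class UsersImport implements ToModel, WithChunkReading, ShouldQueue
{
    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
```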
bhavinG
  • Please provide the relevant code from the link related to the question. Avoid external links, so that even if the link dies we still have the info in the answer. – Himanshu Bansal Jun 24 '19 at 11:54
  • Laravel Excel is not so flexible for large data. I have already tried Laravel Excel. – Vivek Solanki Jun 25 '19 at 04:40
  • I am using it for bulk processing; it has many features like transactions, queues, chunked processing, and model binding, which is the most significant feature for validation and insertion, and it also provides custom integration. I agree Excel is not so flexible, but Laravel Excel makes it easier for you. – bhavinG Jun 25 '19 at 04:52
  • @VivekSolanki which version have you used? Can you share a problem statement so I can share my feedback? – bhavinG Jun 25 '19 at 04:55
  • @VivekSolanki If speed is the issue then you should go for CSV instead; however, model binding will help to achieve better speed. – bhavinG Jun 25 '19 at 09:57