I am extracting a large dataset of about 2,000,000 records. When I fetch the data, I get a 500 internal server error because the memory limit is exceeded. I tried using chunk to return the data in batches of about 1,000 to the client.

But with my code below, I am still getting a 500 internal server error. What am I doing wrong?

I want the data returned to the user in batches so that it does not overload the memory.

Controller

public function getSpecifiedPeriod($user, $from, $to)
    {
        $connection = DB::connection('database');
        $query = $connection->table('items')
            ->where('user_id', $user)
            ->whereBetween('date_time', [$from, $to])
            ->get()
            ->chunk(100);
        if ($query->isEmpty())
            return false;
        return $query;
    }
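
Note that in the code above, `->get()` loads every matching row into memory *before* `->chunk(100)` splits the resulting collection, so the memory limit is already exceeded by the time chunking happens. A minimal sketch of the alternative (assuming the same `items` table and columns) is Laravel's query-builder `chunk()`, which issues limited queries and only ever holds one batch in memory; the `processBatch()` helper here is a hypothetical placeholder for whatever you do with each batch:

```php
public function getSpecifiedPeriod($user, $from, $to)
{
    $connection = DB::connection('database');

    // chunk() on the *builder* (before get()) runs one limited
    // query per batch, so only 1,000 rows are in memory at a time.
    // An orderBy() is required for chunking to page consistently.
    $connection->table('items')
        ->where('user_id', $user)
        ->whereBetween('date_time', [$from, $to])
        ->orderBy('id')
        ->chunk(1000, function ($items) {
            // Hypothetical handler: stream/write each batch out
            // instead of accumulating it in a variable.
            $this->processBatch($items);
        });
}
```

The key design point is that nothing accumulates: each callback invocation replaces the previous batch, so peak memory stays proportional to the chunk size rather than the table size.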
RoboPHP
  • Fetching 2,000,000 records in one go is never a good idea. Try pagination and/or use AJAX to fetch chunks one by one. – rx2347 Mar 16 '20 at 15:32
  • @rx2347, with pagination, the user might not get all the data, since it is paginated – RoboPHP Mar 16 '20 at 15:33
  • Simply use SQL `LIMIT` to extract a range of rows in each query, until no more rows are returned or less than the maximum number of rows is returned. It doesn't matter in the slightest how many rows are in the table. – Mike Robinson Mar 16 '20 at 17:39
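
The loop the comment describes can be sketched with the query builder directly (a hedged example, not the asker's code; batch size and the `id` ordering column are assumptions):

```php
$lastId = 0;
do {
    // Keyset variant of the LIMIT loop: fetch the next batch of
    // rows after the last seen id, until a short batch comes back.
    $batch = DB::connection('database')
        ->table('items')
        ->where('user_id', $user)
        ->whereBetween('date_time', [$from, $to])
        ->where('id', '>', $lastId)
        ->orderBy('id')
        ->limit(1000)
        ->get();

    foreach ($batch as $row) {
        $lastId = $row->id;
        // handle $row here (hypothetical processing step)
    }
} while ($batch->count() === 1000); // fewer than 1000 => last batch
```

Seeking on `id` instead of using `OFFSET` keeps each query cheap even deep into the 2,000,000 rows, since the database never has to skip over already-read rows.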
