
I am using a console command to download some data locally and then dispatch an update job from that data. The issue I'm having is that the data downloaded is around 65 MB for now. The line `Storage::disk('local')->put($name, $content);` specifically throws a PHP fatal error: allowed memory size of 134217728 bytes exhausted, since I assume the put method creates a copy of `$content`, going beyond the 128 MB limit.

Is there a way around this other than setting the memory limit to, say, 256 MB? Can I store this data in chunks maybe? I am not interested in working on the chunks themselves. Is there some Laravel method that takes the reference `&$content` to store the data?

I would prefer a "Laravel" solution if possible.

$name = basename(config('helper.db_url'));

// Loads the entire ~65 MB file into a string in memory.
$content = file_get_contents(config('helper.db_url'));

// This is the line that exhausts the 128 MB memory limit.
Storage::disk('local')->put($name, $content);

UpdatePostsTable::dispatch();

Log::info("Downloaded $name");
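For reference, here is a minimal sketch of the chunked/streamed route asked about above, assuming the same `helper.db_url` config key and `local` disk as in the snippet: `Storage::put()` also accepts a PHP stream resource, in which case Flysystem streams it to disk instead of needing the whole file as a string.

use App\Jobs\UpdatePostsTable; // assuming the job lives in the default App\Jobs namespace
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Storage;

$name = basename(config('helper.db_url'));

// Open a read stream to the remote file instead of pulling it all into a string.
$stream = fopen(config('helper.db_url'), 'r');

// put() also accepts a PHP resource; Flysystem then copies it to disk in chunks,
// so memory usage stays near the stream buffer size rather than the full file.
Storage::disk('local')->put($name, $stream);

if (is_resource($stream)) {
    fclose($stream);
}

UpdatePostsTable::dispatch();

Log::info("Downloaded $name");

Note that `fopen()` on a URL relies on `allow_url_fopen`, just like `file_get_contents()` does; the HTTP-client route suggested in the comments below avoids that.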
kris gjika
  • Do not use `file_get_contents()` to download stuff. It is not the right tool for the job. Better to use purpose-built clients: [Laravel's Http Client](https://laravel.com/docs/9.x/http-client), [PSR-18 clients](https://www.php-fig.org/psr/psr-18/), Guzzle, cURL, etc. (a sketch of this follows below) – Marcin Orlowski Apr 04 '22 at 20:58
  • Thanks, I knew I was missing something. I shouldn't have gone for those old PHP tutorials in the first place. – kris gjika Apr 05 '22 at 05:38
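A rough sketch of the HTTP-client suggestion from the comment above, assuming the `local` disk is backed by a real filesystem path so `path()` can be used as a download target; Guzzle's `sink` request option (passed through `withOptions()`) streams the response body straight into that file instead of buffering it in memory.

use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Storage;

$name = basename(config('helper.db_url'));

// Resolve the absolute path on the local disk (the local driver is backed
// by a real directory, so path() returns a writable filesystem path).
$path = Storage::disk('local')->path($name);

// 'sink' is a Guzzle request option; the Http client passes it through,
// so the response body is written straight to the file while downloading.
Http::withOptions(['sink' => $path])
    ->get(config('helper.db_url'))
    ->throw();

From there the job dispatch and logging can stay exactly as in the original snippet.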

0 Answers