I am using a console command to download some data locally and then dispatch an update job from that data. The issue I'm having is that the downloaded data is currently around 65 MB. The line `Storage::disk('local')->put($name, $content);` throws a PHP fatal error, `Allowed memory size of 134217728 bytes exhausted`, since I assume the `put` method creates a copy of `$content`, pushing memory usage past the 128 MB limit.
Is there a way around this other than raising the memory limit to, say, 256 MB?
Can I store this data in chunks instead? I am not interested in working on the chunks themselves. Is there some Laravel method that takes a reference (`&$contents`), or a stream, to store the data? (A sketch of what I mean follows my current code below.)
I would prefer a "Laravel" solution if possible.
```php
$name = basename(config('helper.db_url'));

// file_get_contents() buffers the entire ~65 MB download in memory.
$content = file_get_contents(config('helper.db_url'));

// This is the line that exhausts the 128 MB memory limit.
Storage::disk('local')->put($name, $content);

UpdatePostsTable::dispatch();
Log::info("Downloaded $name");
```
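For reference, this is the kind of streaming variant I have in mind (a minimal sketch: Laravel's `FilesystemAdapter::put` hands a resource off to `writeStream()` instead of writing a string, so passing a stream should keep memory usage flat; I'm assuming `allow_url_fopen` is enabled, which it already must be for the original `file_get_contents()` call to work):

```php
use App\Jobs\UpdatePostsTable; // namespace assumed from my original snippet
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Storage;

$url  = config('helper.db_url');
$name = basename($url);

// Open a read stream on the remote URL instead of buffering the
// whole response in memory with file_get_contents().
$stream = fopen($url, 'rb');

if ($stream === false) {
    throw new \RuntimeException("Could not open $url");
}

// put() detects a resource and streams it to disk chunk by chunk,
// so only a small buffer is ever held in memory at once.
Storage::disk('local')->put($name, $stream);

// Close the stream ourselves if Laravel hasn't already.
if (is_resource($stream)) {
    fclose($stream);
}

UpdatePostsTable::dispatch();
Log::info("Downloaded $name");
```

If that is how `put` behaves with a resource, it would avoid both the 128 MB copy and any manual chunking on my side, which is exactly the "Laravel" solution I'm after.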