I have to read a large JSON file of approximately 500 MB from SFTP, using Laravel Storage. First I download the file to my local storage/app/public folder, then I read it from the local folder line by line and append the contents to a variable. After that I decode the JSON and save each record to the database. But I am getting a memory-exhausted error, and sometimes a max-execution-time error as well. I have increased the memory limit to 1024M and the execution time to 600 seconds using PHP ini functions, and I have tried many solutions, but none of them works. Here is my code:
// Open a read stream from the SFTP disk
$stream = Storage::disk('incoming_feed_server')->getDriver()->readStream($fileName);

// Flush any output buffers so the transfer is not buffered in memory
while (ob_get_level() > 0) {
    ob_end_flush();
}

// Save the file onto the local disk
Storage::disk('public')->put($newFileNameWithDate, $stream);

// Read the local copy line by line, accumulating everything in $content
$handle = fopen(storage_path('app/public/' . $newFileNameWithDate), 'r');
if ($handle === false) {
    die("Couldn't get handle");
}
$content = '';
while (!feof($handle)) {
    $content .= fgets($handle, 4096);
}
fclose($handle);
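For context, here is a minimal pure-PHP sketch of the streaming direction I have been experimenting with (no Laravel, file names are placeholders). It assumes the feed can be treated as line-delimited JSON (one object per line), which lets each record be decoded and discarded before the next is read; my real feed is a single large JSON document, for which an incremental parser such as the halaxa/json-machine package would presumably be needed instead.

```php
<?php
// Sketch of the two ideas I am trying to apply:
// 1) copy the remote stream to disk in chunks instead of buffering it,
// 2) decode records one at a time instead of concatenating a 500 MB string.
// The SFTP stream is simulated here with a local temp file.

$source = tempnam(sys_get_temp_dir(), 'feed'); // stands in for the SFTP stream
$target = tempnam(sys_get_temp_dir(), 'copy');

// Build a fake feed: one JSON object per line (NDJSON), for illustration only.
$fh = fopen($source, 'w');
for ($i = 0; $i < 1000; $i++) {
    fwrite($fh, json_encode(['id' => $i, 'name' => "record-$i"]) . "\n");
}
fclose($fh);

// 1) Chunked copy: constant memory regardless of file size.
$in  = fopen($source, 'r');
$out = fopen($target, 'w');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// 2) Line-by-line decode: each record is freed before the next is read.
$count  = 0;
$handle = fopen($target, 'r');
while (($line = fgets($handle)) !== false) {
    $record = json_decode($line, true);
    if ($record !== null) {
        // a database insert would go here (batching inserts would also help)
        $count++;
    }
}
fclose($handle);

echo $count, "\n"; // prints 1000
```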
Please let me know the proper solution to this problem. Thanks in advance!