I have to read a large JSON file of approximately 500 MB from SFTP. I am using Laravel's Storage facade to read the file. First I download the file to my local storage/app/public folder, then I read it line by line and append its contents to a variable. After that I decode the JSON data and save each record in the database. However, I am getting a memory-exhausted error and sometimes a max-execution-time error as well. I have increased the memory limit to 1024M and the execution time to 600 seconds using PHP's ini functions and tried many other solutions, but none of them works. Following is my code:

        // Open a read stream to the JSON file on the SFTP disk
        $stream = Storage::disk('incoming_feed_server')->getDriver()->readStream($fileName);

        // Flush any open output buffers before streaming
        while (ob_get_level() > 0) ob_end_flush();

        // Save the file onto the local (public) disk
        Storage::disk('public')->put($newFileNAmeWithDate, $stream);

        // Open the downloaded file for reading
        $handle = fopen(storage_path('app/public/'.$newFileNAmeWithDate), "r") or die("Couldn't get handle");

        // Read the file line by line and append everything to one variable
        $content = '';
        if ($handle) {
            while (!feof($handle)) {
                $content .= fgets($handle, 4096);
            }
            fclose($handle);
        }
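
For context, the limit increases and the decode-and-insert step described above are not shown in the snippet; they might look roughly like the sketch below. The Record model, its columns, and the chunk size of 500 are hypothetical placeholders, and the point is that json_decode() needs the entire ~500 MB string in memory at once.

    // Limit increases mentioned above (exact placement is assumed)
    ini_set('memory_limit', '1024M');
    ini_set('max_execution_time', '600');

    // Decode the whole accumulated string at once, then bulk-insert in chunks.
    // "Record" and the chunk size are placeholders for the real model/table.
    $records = json_decode($content, true);

    foreach (array_chunk($records, 500) as $chunk) {
        \App\Models\Record::insert($chunk);
    }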

Please let me know the proper solution to this problem. Thanks in advance!

Gaganpreet Kaur
    Why would you want to put 500 MB into a variable? That doesn't make any sense. Are you going to increase the time/memory limits indefinitely? – Mike Doe Apr 01 '20 at 10:05

1 Answer

The timeout is an error caused by the PHP configuration. In a Laravel controller, you can call set_time_limit(0); before the file operation. Please see: How to solve a timeout error in Laravel 5
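
A minimal sketch of that, assuming a plain controller method; the class name, route, and import logic are placeholders:

    <?php

    namespace App\Http\Controllers;

    class FeedImportController extends Controller
    {
        public function import(string $fileName)
        {
            // Lift PHP's max_execution_time for this request only (0 = no limit)
            set_time_limit(0);

            // ... download the file from SFTP and process it here ...
        }
    }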

Vicent