
I'm required to download a large XML file from a remote FTP server to local storage so that I can process it.

I've defined an FTP driver that can access the file. However, because of the file's size, PHP fails to allocate enough memory for the operation.

Storage::disk('ftp')->get('path/to/file/bigass.xml');

Is there a way to download the file without exhausting memory?

TheBigK

1 Answer


I suggest switching to a "plain old" curl solution, using something like this:

$curl = curl_init();
$fh   = fopen("localfilename.xml", 'w');
curl_setopt($curl, CURLOPT_URL, "ftp://{$ftp_username}:{$ftp_password}@{$ftp_server}/path/to/file/bigass.xml");
// Write the transfer directly to the open file handle. Do NOT use
// CURLOPT_RETURNTRANSFER here: it would buffer the whole file in a
// PHP string, which is exactly the memory problem you want to avoid.
curl_setopt($curl, CURLOPT_FILE, $fh);
$result = curl_exec($curl);
curl_close($curl);
fclose($fh);
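The reason this stays within the memory limit is that the data moves in fixed-size chunks instead of being materialized as one large string. The same chunked-copy pattern is available in plain PHP via stream_copy_to_stream(); here is a minimal, self-contained sketch where an in-memory stream stands in for the remote ftp:// stream (the fake XML payload and the local filename are illustrative assumptions):

```php
<?php
// Sketch of chunked stream copying: $source is an in-memory stream
// standing in for a remote ftp:// stream; the technique is identical
// either way -- bytes are copied in chunks, never held as one string.
$source = fopen('php://temp', 'w+');
fwrite($source, str_repeat('<item/>', 100000)); // ~700 KB of fake XML
rewind($source);

$dest = fopen('localfilename.xml', 'w');

// Copies from the current position of $source to EOF, chunk by chunk,
// and returns the total number of bytes copied.
$copied = stream_copy_to_stream($source, $dest);

fclose($source);
fclose($dest);

echo $copied, "\n"; // prints 700000
```

If you want to stay within Laravel's Storage facade, the same pattern applies: Storage::disk('ftp')->readStream('path/to/file/bigass.xml') returns a stream resource you can pass to stream_copy_to_stream() instead of calling get(), which loads the entire file into memory.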
Simonluca Landi