A 126 MB .exe file is giving me issues when I try to serve it as a download.
I'm using the standard Laravel download method.
I tried increasing the memory limit, but it still either says I have run out of memory, or I get a 0 KB file.
The documentation doesn't mention anything about large file sizes.
My code is:
ini_set("memory_limit", "-1"); // Trying to see if this works
return Response::download($full_path);
Is there anything I'm doing wrong?
-- Edit --
Going on Phill Sparks' comment, this is what I have and it works. It's a combination of Phill's code plus some from the php.net comments. Not sure if there is anything missing?
public static function big_download($path, $name = null, array $headers = array())
{
    if (is_null($name)) $name = basename($path);

    // Prepare the headers
    $headers = array_merge(array(
        'Content-Description'       => 'File Transfer',
        'Content-Type'              => File::mime(File::extension($path)),
        'Content-Transfer-Encoding' => 'binary',
        'Expires'                   => 0,
        'Cache-Control'             => 'must-revalidate, post-check=0, pre-check=0',
        'Pragma'                    => 'public',
        'Content-Length'            => File::size($path),
    ), $headers);

    $response = new Response('', 200, $headers);
    $response->header('Content-Disposition', $response->disposition($name));

    // If there's a session we should save it now
    if (Config::get('session.driver') !== '')
    {
        Session::save();
    }

    // Below is from the comments at http://uk1.php.net/manual/en/function.fpassthru.php
    session_write_close();
    if (ob_get_level() > 0) ob_end_clean(); // guard so this doesn't raise a notice when no buffer is open
    $response->send_headers();

    // Stream the file in 8 KB chunks so it is never loaded into memory in full
    if ($file = fopen($path, 'rb'))
    {
        while ( ! feof($file) and connection_status() == CONNECTION_NORMAL)
        {
            print(fread($file, 1024 * 8));
            flush();
        }
        fclose($file);
    }

    // Finish off, like Laravel would
    Event::fire('laravel.done', array($response));
    $response->foundation->finish();

    exit;
}
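
For anyone wanting to use it, a call would look roughly like this. FileHelper is just a placeholder for whatever class you put the method in, and the path is only an example:

Route::get('download', function()
{
    // Example path only; point this at your real file
    $full_path = path('storage').'files/setup.exe';

    // big_download() sends the headers, streams the file and exits itself,
    // so there is no Response object to return from the route
    FileHelper::big_download($full_path);
});

Because the method exits after streaming, Laravel never builds the whole file into a response body, which is what keeps memory use flat regardless of file size.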