1

I'm using the following code to download a large file (around 250 MB):

if (file_exists($leadon)) {
    set_time_limit(0); // no script time limit for the long transfer
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    // Quote the filename so names containing spaces survive intact
    header('Content-Disposition: attachment; filename="' . $file_name . '"');
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($leadon));
    // Discard any buffered output so it doesn't corrupt the byte stream
    if (ob_get_level()) {
        ob_end_clean();
    }
    flush();
    readfile($leadon);
    exit;
}
die();

But the file is only partially downloaded (anywhere from 18 MB to 75 MB, depending on the client's connection speed). The server administrator told me that I'm triggering a 500 (Internal Server Error) response, and the server's security layer treats this as an attack and blocks my IP. Please help me with this: what could cause the internal error, or make this look like a security breach?
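One common cause of truncated downloads through PHP is that readfile() pushes the whole file through an output buffer, hitting memory or timeout limits mid-transfer. A frequently suggested workaround, shown here as an untested sketch, is to stream the file in fixed-size chunks so memory use stays constant regardless of file size:

```php
<?php
// Sketch: stream a large file in fixed-size chunks instead of readfile(),
// so memory use stays constant regardless of file size.
function stream_file(string $path, int $chunkSize = 1048576): bool
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        echo fread($handle, $chunkSize); // send one chunk to the client
        flush();                         // push it out before reading more
    }
    fclose($handle);
    return true;
}
```

This would be called in place of readfile($leadon), after the headers have been sent.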

As a workaround, I copied the file into a temp directory, let the user download it via a header() redirect, and removed files older than 24 hours whenever the page reloads or another download is requested. I have given the code below; please let me know if there is a better solution.

if (file_exists($leadon)) {
    $temp_folder = time();
    $olderTime   = $temp_folder - (24 * 60 * 60);
    $temp_dir    = $tempdir . $temp_folder;
    mkdir($temp_dir, 0775, true);

    // Remove temp directories older than 24 hours
    $dirs = glob($tempdir . '*');
    foreach ($dirs as $dir) {
        if (is_dir($dir)) {
            $temp_dir_name = explode('/', $dir);
            $dir_name      = $temp_dir_name[1];
            if (intval($dir_name) < intval($olderTime)) {
                deleteDir($dir); // custom recursive-delete helper
            }
        }
    }

    set_time_limit(0); // no execution time limit
    $file_name = str_replace(' ', '_', basename($leadon));
    $temp_file = $temp_dir . "/" . $file_name;
    copy($leadon, $temp_file);
    header('Location: ' . $doman_name . '/' . $temp_file);
    exit();
}
ahmed
    have you checked the server config for timeouts or file limitation? – Soundz Aug 29 '12 at 07:49
    sounds like php is timing out – Laurence Aug 29 '12 at 07:49
  • Which timeout variables should I change, and where? I have set the socket timeout to 0, and desperately made the following changes in php.ini as well: max_execution_time = 0; max_input_time = 0; memory_limit = 10240M; – ahmed Aug 29 '12 at 07:59
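Note that memory_limit has little bearing here, since readfile() streams the file without loading it all into memory, and max_execution_time is often overridden by timeouts enforced in the web server or FastCGI layer, which php.ini cannot control. As an illustration only (directive names assume Apache with mod_fcgid; the right values and directives depend on the actual stack):

```apacheconf
# httpd.conf — hypothetical values, assuming Apache with mod_fcgid
Timeout 600            # overall connection timeout
FcgidIOTimeout 600     # how long Apache waits for PHP to produce output
FcgidBusyTimeout 600   # how long one PHP request may run in total
```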

1 Answer


Why are you writing the file out through PHP at all? How about keeping the file somewhere under the web server's document root and redirecting the user to it? Let the web server do the work of serving the file.
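If the web server supports it, you can get both benefits at once (the real path stays hidden, and the server does the I/O) with X-Sendfile on Apache (mod_xsendfile) or X-Accel-Redirect on nginx. A minimal sketch, assuming mod_xsendfile is installed and its XSendFilePath allows the repository directory; PHP only authorises the download, while Apache itself streams the bytes, so no PHP timeout or memory limit applies:

```php
<?php
// Sketch: build the headers that hand the transfer off to mod_xsendfile.
// Apache strips the X-Sendfile header and serves the named file itself.
function xsendfile_headers(string $path): array
{
    return [
        'Content-Type: application/octet-stream',
        'Content-Disposition: attachment; filename="' . basename($path) . '"',
        'X-Sendfile: ' . $path,
    ];
}

// Usage inside the download script (where $leadon holds the real path):
// if (file_exists($leadon)) {
//     foreach (xsendfile_headers($leadon) as $h) { header($h); }
//     exit;
// }
```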

mrd081
  • Actually, I'm sending a shared link to the user to download the file from my repository, so I cannot have that file path displayed to the user. – ahmed Aug 29 '12 at 09:09
    I still suggest you use a temporary directory to hold the file and delete that folder some time later. As you describe, the file is large, and you are tying up one PHP process for that whole time, which is not a good idea. – mrd081 Aug 29 '12 at 09:25
  • Yes, I copied it into a temp directory and let the user download via a header() redirect, removing files older than 24 hours whenever the page reloads or another download is requested. I have given the code above; please let me know if there is a better solution. – ahmed Aug 30 '12 at 11:26