
There's a question here (on StackOverflow) that asks about streaming large files to the user in chunks. An answer to that question refers to code (originally from here) that shows how. I'm looking for how to just save the file to the server instead.

Notes:

  • The script here aims to download a file to the server by providing a URL to download it from (this process is also called remote upload).
  • My server provider does not let me change the script time limit, so downloads using this script time out.
  • I am able to save file contents to the server using file_put_contents("MyFile.iso", $buffer, FILE_APPEND), but not the whole file, mostly because the script runs for a long time and hits the timeout.
  • I think a solution could work like this: JavaScript requests PHP actions in the background via AJAX multiple times; the first background request tells PHP to download the first 100MB of the file, the second request tells PHP to download the second 100MB, and so on until PHP tells the JavaScript that we have reached the end of the file. So instead of downloading the file in one long-running process, we download it in multiple short-running ones. (A sketch of this idea follows the code below.)

Below is the mentioned code that I need to start with in order to save/remote-upload the file to the server (edited: it now saves the file to the server, but not the whole file, mostly because the script runs for too long):

<?php
define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of each chunk

// Read a remote file and append its contents to a local file, chunk by chunk
function readfile_chunked($fileurl, $retbytes = TRUE) {
    $buffer = '';
    $cnt    = 0;
    $handle = fopen($fileurl, 'rb');

    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        file_put_contents("MyFile.iso",$buffer,FILE_APPEND);

        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }

    $status = fclose($handle);

    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }

    return $status;
}

$fileurl = 'http://releases.ubuntu.com/16.04.2/ubuntu-16.04.2-desktop-amd64.iso';
// No Content-Type header is needed: the file is written to disk on the
// server, not streamed to the browser.
readfile_chunked($fileurl);

?>
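For the multi-request idea in the notes, a minimal sketch might look like the following, assuming the remote server honors HTTP Range requests (a hypothetical download_slice.php; the "start" parameter and the 100MB slice size are illustrative choices, not a fixed API):

<?php
// download_slice.php -- hypothetical endpoint for the AJAX idea in the
// notes above. Each request fetches one ~100MB byte range of the remote
// file and appends it to MyFile.iso, so no single request has to outlive
// the provider's time limit.
define('CHUNK_SIZE', 1024 * 1024);       // read 1MB at a time
define('SLICE_SIZE', 100 * 1024 * 1024); // fetch 100MB per request

$fileurl = 'http://releases.ubuntu.com/16.04.2/ubuntu-16.04.2-desktop-amd64.iso';
$start   = isset($_GET['start']) ? (int) $_GET['start'] : 0;
$end     = $start + SLICE_SIZE - 1;

if ($start === 0) {
    file_put_contents('MyFile.iso', ''); // truncate any previous attempt
}

// Ask the remote server for just this byte range.
$context = stream_context_create([
    'http' => ['header' => 'Range: bytes=' . $start . '-' . $end],
]);
$handle = fopen($fileurl, 'rb', false, $context);

if ($handle === false) {
    die('error');
}

$received = 0;

while (!feof($handle)) {
    $buffer = fread($handle, CHUNK_SIZE);
    file_put_contents('MyFile.iso', $buffer, FILE_APPEND);
    $received += strlen($buffer);
}

fclose($handle);

// A short slice means we are past the end of the file; the JavaScript
// caller keeps requesting ?start=0, ?start=104857600, ... until 'done'.
echo ($received < SLICE_SIZE) ? 'done' : 'more';
?>

One caveat: a server that ignores the Range header replies with the whole file and a 200 status instead of 206 Partial Content, so a robust version should inspect $http_response_header before appending.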
  • So how are you sending the file to the server? And how are you chunking it on the way? – RiggsFolly Mar 07 '17 at 18:13
  • Have you read this on how to upload files to the server? [It's in the PHP manual](http://php.net/manual/en/features.file-upload.php) – RiggsFolly Mar 07 '17 at 18:14
  • I need to download a file from a URL to the server – Omar Mar 07 '17 at 18:16
  • That would have been useful information to put **in your question** – RiggsFolly Mar 07 '17 at 18:17
  • Why does it have to be chunked specifically? What would be wrong with doing a simple `file_get_contents()`? – RiggsFolly Mar 07 '17 at 18:18
  • @RiggsFolly Because downloading large files that way will run out of memory. – Omar Mar 07 '17 at 18:25
  • Instead I suggest using this plugin: http://www.albanx.com/ajaxuploader/. I used it in many projects that have a 2MB max upload limit; it can upload files of any size – Ananth Mar 07 '17 at 18:26
  • @user3367928 I think it doesn't support remote upload (uploading a file to the server by just providing a URL) – Omar Mar 07 '17 at 18:29
  • This plugin also supports PHP, so you can check how it works – Ananth Mar 07 '17 at 18:35
  • @user3367928 I can't find a way to give the plugin the URL in order to download to my server. It seems to support only direct uploads. – Omar Mar 07 '17 at 20:46
  • Possible duplicate of [Downloading a large file using curl](http://stackoverflow.com/questions/6409462/downloading-a-large-file-using-curl) – Pieter van den Ham Mar 07 '17 at 23:39
  • @Pete My question is about editing the specific code mentioned in the question post in order to save the file, without using a totally different method such as curl. Why? Because the curl method produced a time-limit error (and my server provider has disabled changing the limit), while this algorithm seems workable with some AJAX-like workaround. So the question is specifically about the code mentioned in the question. – Omar Mar 08 '17 at 00:16
  • @Omar If there is a timeout error, no "workaround" will be able to "work around" that. That file has to be downloaded/uploaded to the server either way. Maybe switch to a different provider or VPS. – Pieter van den Ham Mar 08 '17 at 00:25
  • @Pete What about a JavaScript method that requests PHP actions in the background via AJAX multiple times? The first background request tells PHP to download the first 100MB of the file, the second request tells PHP to download the second 100MB, and so on until PHP tells the JavaScript that we have reached the end of the file. So instead of downloading the file in one long-running process, we download it in multiple short-running ones. – Omar Mar 08 '17 at 10:09

1 Answer


I know this question is old, but I hope my answer helps others.

The question is not really 'Download Large File To Server'; it should be 'Upload Large File To Server' instead.

There are many ways to upload a large file to a server. Here is how I do it using FileReader and XMLHttpRequest.

https://stackoverflow.com/a/49808460/6348813

The idea is: you read the file as a binary string (readAsBinaryString()) or as an ArrayBuffer (readAsArrayBuffer()), and then you can stream the file to the server. In PHP, listening for the streamed binary data or ArrayBuffer is as simple as reading from php://input.
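For illustration, a minimal sketch of that receiving side might look like the following (assuming the browser POSTs the raw bytes; upload.bin is a placeholder file name, not something from the linked answer):

<?php
// receive.php -- a minimal sketch of the php://input idea above. It reads
// the raw request body that the browser streamed via XMLHttpRequest and
// appends it to a local file; no $_FILES handling is involved.
$in  = fopen('php://input', 'rb'); // the raw POST body
$out = fopen('upload.bin', 'ab');  // append, so repeated chunks accumulate

stream_copy_to_stream($in, $out);  // copy without loading it all into memory

fclose($in);
fclose($out);
?>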

You need to consider doing this over HTTPS, or your connection will be open to attackers.

Alternatively, you may try uploading the file using the slice() method. This method performs well because it supports pausing, closing, and resuming the upload whenever the connection is unstable. (A sketch of the resume step follows.)
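As a sketch of that resume step, under the same assumptions (a hypothetical offset.php helper; upload.bin as above): before resuming, the client asks how many bytes the server already has, then slices the file from that offset.

<?php
// offset.php -- a hypothetical helper for resuming a slice()-based upload.
// It returns the number of bytes already stored, so the client knows the
// offset to resume slicing from.
$file = 'upload.bin';
echo file_exists($file) ? filesize($file) : 0;
?>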

In my experience, the readAsArrayBuffer() method seems faster than the readAsBinaryString() and slice() methods, even when the connection is under 100kbps.

All of the above methods share one feature: you don't need to change your PHP upload limit.

Note:

DO NOT USE the readAsBinaryString() method, as it is deprecated (see the Mozilla warning below) and does not support files larger than 300MB.

This feature is non-standard and is not on a standards track. Do not use it on production sites facing the Web: it will not work for every user. There may also be large incompatibilities between implementations and the behavior may change in the future.

Original articles:

https://developer.mozilla.org/en-US/docs/Web/API/FileReader

https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest

  • I think I need to make my question more clear. I'm not trying to upload a file that I already have stored on my machine to the server. I'm trying to upload a file that is stored on a remote server and that I have access to via a direct link like this: `http://releases.ubuntu.com/16.04.2/ubuntu-16.04.2-desktop-amd64.iso`. What I want is to copy that file (which doesn't exist on my machine) directly to my server, without downloading the file to my machine first and then uploading it to the server. In other words, I want to download (or remote-upload) the file to my server. This saves me a lot of pain. – Omar Apr 13 '18 at 14:10
  • You want to send the file from server to server. There are plenty of ways to do that. If you have access to those servers, then you need to open a stream on the receiving end's input, like my `fopen("php://input")`. After the stream is open, you can use JavaScript to collect those `ArrayBuffer` chunks and send them to the open stream on the server. – Mr Hery Apr 13 '18 at 14:40
  • I don't have admin access to the server hosting the file I want to get. I only have access to the file like any downloadable file distributed on the internet. I'm not sure if your mentioned method downloads slices to my PC first, then uploads the ready binary chunks of the file to my server through JavaScript. Correct me if I'm wrong please. I don't want any chunks to be downloaded to my PC first, because this will slow down the process by my PC's connection speed. I want the file to be transferred directly from the host server to my server without any intermediary. – Omar Apr 13 '18 at 15:04
  • Using the `readAsArrayBuffer()` method will get the data, but it is still saved temporarily in your computer's memory. If you don't want any middleman, then you need access to the server so you can make it send the file – Mr Hery Apr 13 '18 at 15:10