
Is it possible to download files with the help of AJAX requests via XHR2? I am able to upload files to the server, but I cannot find a way to download them.

  • PHP Version 5.4.12
  • Apache/2.4.4 (Win64) PHP/5.4.12

I need a solution for downloading very large files (up to 4GB). To save server memory, the files need to be sent in chunks. I also need to monitor download progress to provide feedback, especially for large files.

I have tried many things, but nothing works well: I run out of PHP memory, the download takes very long to start, I cannot monitor the progress, cURL is very slow, X-Sendfile provides no download progress, or I cannot run two PHP scripts at once (one sending the data while the other monitors the progress).

Bunkai.Satori
  • Practically you must store stuff either in memory or on disk, and if you are short on memory I suggest you split the files on disk before sending them to the client. And if that's not possible, you are out of luck; upgrade the server. – Eric Herlitz Oct 30 '13 at 21:11
  • @EricHerlitz, I understand your point. However, I would like to know if there is XHR2 functionality allowing me to download files through AJAX requests. That could solve my problems for good. – Bunkai.Satori Oct 30 '13 at 21:15
  • AJAX is the same as a regular request; it's just executed via JS instead of redirecting your browser to another URL entirely. Why would it be any different for your server? What seems to be the problem is the way you're reading and outputting the file (smells like `readfile()` to me). – N.B. Oct 30 '13 at 21:17
  • @N.B. Hi. Maybe [this link explains](http://stackoverflow.com/questions/14682556/why-threre-is-no-way-to-download-file-using-ajax-request) it. Currently, I am using readfile(). That is correct. After trying everything else, readfile() seems to work best. Unfortunately, I cannot get the progress of readfile(). Therefore, I hoped XHR2 could have a solution for me. – Bunkai.Satori Oct 30 '13 at 21:22
  • `readfile()` isn't an optimal solution and that's why you run out of memory. What you want is a fopen()/fread() combo. You open the file, read a chunk of 8 KB, output it, and so on (output as in `echo $read_file_chunk` or whatever you name the variable). As for monitoring - I don't know what you mean, so I'll assume a download progress bar. In order to have that, you need to send how big the file is with `header('Content-Length: '. filesize($file));` first. – N.B. Oct 30 '13 at 21:23
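The fopen()/fread() loop described in this comment can be sketched as follows; the function name, chunk size, and Content-Type are illustrative, not from the thread:

```php
<?php
// Sketch: stream a file in fixed-size chunks so the whole file is
// never held in PHP memory at once.
function stream_file($path, $chunkSize = 8192)
{
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');

    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        echo fread($fp, $chunkSize); // send one chunk
        flush();                     // push it toward the client
        // with output buffering enabled you may also need ob_flush()
    }
    fclose($fp);
}
```

Because only one chunk is in memory at a time, this approach stays within the PHP memory limit even for multi-gigabyte files.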
  • @N.B. exactly, that is what I mean: to send progress information to the browser from the server. However, if the output buffer is full of file data while outputting, how can I mix in file-size information so that it is delivered continuously to the browser and not injected into the file content? – Bunkai.Satori Oct 30 '13 at 21:28
  • You send all the headers you want first using the `header` function (Content-Type, Content-Length, etc.), then you fopen() the file, fread() 8 KB of it (you can play with the size to see what fits best), then echo that chunk. Now, as for the output buffer - [you might try this to force buffer flushing](http://us3.php.net/manual/en/function.ob-flush.php#90529). – N.B. Oct 30 '13 at 21:35
  • @N.B. that makes sense. So say I am in a situation where a 300 MB file is being downloaded. Now the client, probably JavaScript, needs to read the header and know how much data has been downloaded so it can display progress information. How do I extract this information during the download? That is basically my question. – Bunkai.Satori Oct 30 '13 at 21:38
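On the XHR2 side, progress events expose `loaded` and `total` fields when the server sends a Content-Length header; a minimal client sketch (the URL `/download.php` is a placeholder, not from the thread):

```javascript
// Compute a percentage from a progress event's loaded/total fields.
function progressPercent(loaded, total) {
    return Math.round((loaded / total) * 100);
}

if (typeof XMLHttpRequest !== 'undefined') {   // browser only
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/download.php', true);    // placeholder URL
    xhr.responseType = 'blob';                 // avoid buffering into a JS string

    xhr.onprogress = function (e) {
        if (e.lengthComputable) {              // true when Content-Length was sent
            console.log('Downloaded ' + progressPercent(e.loaded, e.total) + '%');
        }
    };
    xhr.send();
}
```

Note that this monitors the XHR transfer itself; getting the received Blob onto the user's disk is a separate problem, as the comments below discuss.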
  • Well, you can't download the file with JS to the disk; you need to redirect the browser to the download page. But if it's a page that outputs a file (Content-Disposition: attachment), then the user stays on the same page and the file downloads. If you want to create some sort of custom progress bar using JS and download the file to the user's disk - I'm afraid it's a no-go. – N.B. Oct 30 '13 at 21:41
  • That is my point: I want to create some sort of custom progress bar for the user. It must be doable, as there is already a solution using cURL; however, that one is very slow. Another solution would be to issue AJAX calls to access a session variable reflecting the progress on the server. However, for some reason, I cannot run two PHP scripts at the same time on the server. – Bunkai.Satori Oct 30 '13 at 21:46
  • Sadly, you can't customize the file download progress bar in browsers. You could develop an extension for Chrome/Firefox, use AJAX to fetch the contents and store them in the browser's memory, and then use that extension to try to flush the file to the user's disk - could, not should, and I am not quite sure whether it's possible for a browser extension to tamper with the user's disk (I think it is). Other than that, you can attempt some sort of download/check-session-variable approach. However - why? The browser can already tell you the status, so why do you need a custom solution? – N.B. Oct 30 '13 at 22:26
  • The reason for a custom progress bar is consistency. My portal offers file uploads with a nice upload progress bar. I want to use it for file downloads as well. I know I can see the information in the browser's download page/download bar; however, I believe that is not consistent. I also wish to provide the user with estimated time remaining, which is important for large files. – Bunkai.Satori Oct 30 '13 at 22:32
  • @N.B. Do you know, please, whether two PHP scripts within one session can run in parallel? I am not able to achieve that. – Bunkai.Satori Oct 30 '13 at 22:33
  • Without knowing how you're failing to run two processes, I can't really say anything. Can you describe how it is that you can't run two scripts at the same time, and how you're trying to do so? It is possible to run thousands of PHP scripts in parallel. As for consistency - people are used to their browsers and the download progress bar. It is also questionable that someone would download a 4GB file and keep staring at your website rather than browse another website or do something else, making your effort totally vain :) – N.B. Oct 30 '13 at 22:48
  • @N.B., you asked how I implement parallel PHP scripts. One script is meant to run for a long time - it is the download script. The other script is called via AJAX requests and is supposed to return session variables with the download status: [Please see this link, with detailed information and code samples](http://stackoverflow.com/questions/19692282/php-script-in-iframe-blocks-other-code) – Bunkai.Satori Oct 30 '13 at 22:54
  • If your sessions are file-based, it's definitely that your second script can't read the session file until the other one ends and releases the lock. The solution is to avoid sessions and use a different layer for sharing information (database, Redis, Memcached, shared memory). – N.B. Oct 30 '13 at 22:58
  • @N.B., well, I use `session_start()` and `session_write_close();`. However, my code did not work even without sessions. At least I think so; I have done so many tests. Do you believe the code might fail despite my calling `session_write_close();`? – Bunkai.Satori Oct 30 '13 at 23:01
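For reference, one way to apply the `session_write_close()` pattern being discussed in a long-running script is sketched below; the `progress` key and its values are hypothetical:

```php
<?php
// Sketch: release the file-based session lock before long-running work
// so a second, polling script can read the session concurrently.
session_start();
$_SESSION['progress'] = 0;
session_write_close();       // releases the lock on the session file

// ... long-running download loop would run here ...

// To update progress, briefly re-open and close the session again:
session_start();
$_SESSION['progress'] = 50;  // hypothetical progress value
session_write_close();
```

If the lock is held for the whole download instead, the polling script blocks until the download script finishes, which matches the symptom described here.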
  • Have you tried a test with a big file which you output, instead of a page where you use sleep() to simulate workload? I'd share the data between two different PHP scripts using a memory-based layer rather than sessions, due to possible locks. Also, you're testing all this on Windows, which doesn't work the same as Linux, and Windows is outside my domain of knowledge when it comes to filesystem operations, so I can't really say much about why your code doesn't work. Judging by what's there, it should work. – N.B. Oct 30 '13 at 23:06
  • @N.B., of course I did tests with a real file download using `fopen()`, `fread()`, and `print()`. It did not work, so I decided on a simpler solution. I need to go to sleep now, but I will follow up tomorrow. If you have anything to add, please do so. Have a nice rest of the day, and thanks for your help. – Bunkai.Satori Oct 30 '13 at 23:11

0 Answers