
There are many questions with this title, but none of them helps. My application should handle file uploads up to 2 GB in size, and it should be done through a browser, not with something like an FTP file uploader.
I'm aware of the configuration settings that have to be made, for example in php.ini (a sketch of those settings follows the list below), but there are some questions on my mind:

  1. Is something like the Uploadify library a good solution for this file size, or are there better alternatives?

  2. Because of the large size, it is probably desirable to have pause/resume functionality. Is it possible to implement this through an HTTP file transfer via a browser? If yes, how?

  3. Some people mention vulnerabilities such as DoS attacks. Is this a serious issue in this case, and what are the considerations for this type of attack?
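
For reference, the php.ini directives I mean are sketched below; the values are only my assumption of what roughly 2 GB uploads would need, not something I have verified:

```ini
; Assumed values for ~2 GB browser uploads -- not verified, adjust for your server
upload_max_filesize = 2100M   ; a little above 2 GB for headroom
post_max_size       = 2200M   ; must be at least as large as upload_max_filesize
max_input_time      = 7200    ; give the request up to 2 hours to arrive
max_execution_time  = 300     ; time for the script itself, after the upload is received
memory_limit        = 256M    ; uploads are buffered to a temp file on disk, not into memory
```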

If there are any extra recommendations or suggestions, please tell me about them.

UPDATE: Some have suggested doing the job with FTP uploads. Should this be done by giving each user an FTP account and letting them upload files with an FTP client such as FileZilla? If so, how should the incoming files be processed? For example, I give each user a directory like /home/user1 and he uploads his files into that directory. How should I then pick up the uploaded files and save their data to the database according to the user's session data?
Generally, I mean: how do I script on top of this FTP upload system?
Or if that is impossible, please tell me.
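
To make this more concrete, the kind of script I imagine running (for example from cron) is sketched below; the database table, column names and directory layout are just placeholders I made up, not an existing design:

```php
<?php
// Sketch only: poll each user's FTP directory and register new files in the DB.
// Paths, table and column names are placeholders, not a working design.
$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'dbuser', 'dbpass');

$ftpRoot = '/home';                        // e.g. /home/user1, /home/user2, ...
foreach (glob($ftpRoot . '/*', GLOB_ONLYDIR) as $userDir) {
    $username = basename($userDir);        // the directory name doubles as the username

    foreach (glob($userDir . '/*') as $file) {
        if (!is_file($file)) {
            continue;
        }

        // Skip files we have already registered.
        $check = $pdo->prepare('SELECT COUNT(*) FROM uploads WHERE path = ?');
        $check->execute(array($file));
        if ($check->fetchColumn() > 0) {
            continue;
        }

        // Record the upload; the web app can later match it to the user by the
        // username column instead of relying on session data at upload time.
        $insert = $pdo->prepare(
            'INSERT INTO uploads (username, path, size, uploaded_at) VALUES (?, ?, ?, NOW())'
        );
        $insert->execute(array($username, $file, filesize($file)));
    }
}
```

One obvious gap in such a script is that it can pick up files that are still being transferred, so in practice there would have to be some check (file age, size stability, or an FTP server hook) before a file is registered.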

Please help.

Aliweb
  • Even if it's possible, 2 GB through a browser is still a *lot.* It's likely to take hours or days. If the connection gets flaky for just a second, aborted connections usually can't be resumed. Do you have a use case where you really need this on a regular basis? – Pekka Feb 14 '13 at 14:43
  • This is just theory coming from me, I haven't tried any of this in practice, so here goes: you can use the FileReader API to read the file in newer browsers and then manipulate it to your liking. That's what Uploadify does if the browser supports it, from what I can remember. That also means you can send the file in small chunks to the webserver. The problem is gluing the pieces together. The other problem is the unstable behaviour of browsers when it comes to reading such large files. Googling FileReader API might help more. – N.B. Feb 14 '13 at 14:47
  • @Pekka웃: My application should allow file uploads of up to 2 GB, but I don't know what you mean by "regular basis". I know it should be done via a browser, not for example an FTP client. But if you mean for example using Flash, that is possible. – Aliweb Feb 14 '13 at 14:49
  • @ali as said, it's possible, but not very practical. FTP or a custom uploader application would work better. – Pekka Feb 14 '13 at 14:52
  • @N.B. [This one](http://stackoverflow.com/questions/14876412/large-file-uploads-using-filereader-and-php) seems to use the method you've suggested, so it seems possible. Found it right below this question on my "Top Questions" page. :) – Carsten Feb 14 '13 at 14:56
  • @Pekka웃: In the case of FTP, how should I implement this? Should it be done the way hosting providers do it, giving the user an FTP account and forcing him to use some FTP client, or is there a better and more logical solution? – Aliweb Feb 14 '13 at 14:57
  • @Pekka웃: Would you please answer my last update? – Aliweb Feb 15 '13 at 09:05
  • @ali you would have to create an FTP account for every user for perfect security, yeah. Usually, users then start a PHP script that processes their upload. It's possible to monitor FTP accounts for new files, but you'd need a dedicated server for that. – Pekka Feb 15 '13 at 09:58
  • @Pekka웃: Well, basic FTP uploads just put a file in a directory and don't fire any script. I want to know how to fire a script and do some processing after a new file is uploaded. – Aliweb Feb 15 '13 at 12:55
  • @ali as said, you'd need a dedicated server with root access for that. Do you have one? – Pekka Feb 15 '13 at 12:57
  • @Pekka웃: Yes, I do. The files will be uploaded to a separate server with root access. – Aliweb Feb 15 '13 at 13:14
  • @ali a starting point: [Monitor Directory for Changes](http://stackoverflow.com/q/511463). Your FTP server may have triggers too that can start a program when a new upload comes in; you'd have to check what server you have running. – Pekka Feb 15 '13 at 13:18

1 Answer


Pause/resume is not a standard feature of HTTP uploads.

It ought to be possible to get it working, but only with a lot of work, and the result would probably be quite flaky.

To implement resuming, you'd need the following:

  • a Javascript front end that can query the server and ask how much of the file was already uploaded.
  • the Javascript would also need to be able to break up the upload file into chunks in the browser (this is going to be very painful for the user with a 2GB file), and upload it starting at the point the upload stopped previously.
  • since the HTTP request sends the whole file in one go, if you want to be able to pause, the Javascript would also need to send it in chunks, and have an event handler that aborts the next chunk if the user presses 'cancel'.
  • your PHP program would treat each resume as a whole new file upload, so it would need to know that it's receiving a new chunk of an existing file, and use append mode (a rough server-side sketch follows this list).
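
To make that concrete, here is a very rough sketch of what such a PHP endpoint might look like; the parameter names, the target directory and the complete lack of authentication and validation are placeholder choices for illustration only:

```php
<?php
// Rough sketch of a chunk-receiving endpoint -- not a complete or secure design.
$uploadDir = '/var/uploads/';
$filename  = isset($_REQUEST['filename']) ? basename($_REQUEST['filename']) : '';
$target    = $uploadDir . $filename;

if ($filename === '') {
    http_response_code(400);
    exit;
}

if ($_SERVER['REQUEST_METHOD'] === 'GET') {
    // The front end asks how much of this file has already arrived,
    // so it knows which byte offset to resume from.
    header('Content-Type: application/json');
    echo json_encode(array('offset' => file_exists($target) ? filesize($target) : 0));
    exit;
}

// POST: append the received chunk to whatever we already have.
if (!isset($_FILES['chunk']) || $_FILES['chunk']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit;
}

$in  = fopen($_FILES['chunk']['tmp_name'], 'rb');
$out = fopen($target, 'ab');   // 'a' = append: each chunk is written to the end of the file
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

clearstatcache();              // filesize() results are cached, so clear before re-reading
header('Content-Type: application/json');
echo json_encode(array('offset' => filesize($target)));
```

A real version would at least check that the chunk's declared starting offset matches the current size of the target file, authenticate the user, and sanitise the filename; the sketch is only meant to show the "ask for the offset, then append" flow from the list above.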

If that sounds like a recipe for a flaky and error-ridden system, then you'd be right. I wouldn't want to use a site like this to upload a file. There are just too many things that can go wrong.

There's a tutorial here that may help, but I still don't think it's a good idea.

If I were you, I'd consider setting up an (S)FTP server for your uploads. It will be a lot less hassle.

SDC
  • Would you tell me about the implementation of an FTP server for this case? Please look at my last comment on the question. – Aliweb Feb 14 '13 at 15:04
  • Just provide an FTP account, and let the user decide what FTP software to use (maybe suggest a good free one for them if they don't know any, e.g. FileZilla). – SDC Feb 14 '13 at 15:07
  • I think users will hate this type of file upload. I'm looking for a user-friendly approach, something like what YouTube has implemented. – Aliweb Feb 14 '13 at 15:10
  • Hmm, every FTP program I know of is pretty easy to use -- drag and drop from a folder to the server. But if you really think people can't cope with that, then by all means follow the tutorial I linked to. But be aware that it isn't going to be easy. Oh, and it definitely won't work with old browsers like IE8. Maybe you could offer your users the FTP option, even if it's just while you spend time writing the code for the browser solution. – SDC Feb 14 '13 at 15:22