
I've created a web-based kiosk app (PHP & JavaScript) that has to copy relatively large files (1-2 GB) to the kiosk user's USB storage device. PHP's file copy is too slow, so I had to invoke the native Windows file copy operation through a complicated exec() call. The problems I have are mainly two:

1) I can't get continuous feedback on the copy progress.
2) I can't get reliable information on when and whether the copy has failed or been interrupted, and this is very important since it's a self-service kiosk app.

Does anyone have any ideas on how to achieve reliable, fast file copy and progress feedback from within a browser? Thanks!

user3777916
  • where are the files coming from? it sounds like a job well-suited for node.js. – dandavis Jun 26 '14 at 08:33
  • From an attached external hard disk – user3777916 Jun 26 '14 at 08:36
  • ok, so you're trying to copy a file from a local hard drive to a local thumb drive? – dandavis Jun 26 '14 at 08:37
  • Yes, that's exactly what I'm trying to achieve, also don't forget that I need to display the % copied and estimated completion time that is refreshed about every 2 seconds on the webpage. – user3777916 Jun 26 '14 at 08:44
  • the 2nd answer in http://stackoverflow.com/questions/11293857/fastest-way-to-copy-file-in-node-js might work. it seems that you can add a "data" event to the rd object in that answer (http://nodejs.org/api/stream.html#stream_event_data), and you would know how many bytes are read each execution, which you can then sum and push to a txt file, http port, etc in order to get it to the html. – dandavis Jun 26 '14 at 08:55
  • a trick i used to do with huge log files: you can break the copy operation into a series of appends instead of one big save. if you know how big each append is, and how many appends you've done (both known quantities), you can do a rough progress indicator. give it some css transition-duration and it will look smooth. – dandavis Jun 26 '14 at 08:58
  • Are you sure using node.js would be as fast as the native copy operation? – user3777916 Jun 26 '14 at 09:06
  • node.js has really good low-level IO libraries. i'm not sure of anything, but if you use pipe() to connect a couple streams, it will be about as fast as realistically possible from everything i've seen. note that node.js was literally invented to report progress back during large uploads since php sucks at that... – dandavis Jun 26 '14 at 09:14
  • Thanks. Will give it a try and report back here – user3777916 Jun 26 '14 at 09:21

0 Answers