
I have a Rails API connected to an Angular app whose main purpose is uploading different files to the server and making copies of them. Until now it has worked perfectly, but the biggest files I've had to upload so far are around 500 MB, and now I have to upload arrays of files that tend to total more than 20 GB of data.

I use angular np-upload to send the files over HTTP and the CarrierWave gem to manage file and folder creation on the server side, but I don't want to do the same when the payload is this big. So my question is: is there a way to do this differently? Something like Google Drive folders that synchronize constantly without the user even noticing. I want to accomplish something like that with my application, so that while the really big files are uploading the app doesn't freeze, and the user can upload several things at once and keep interacting with the app even while it is uploading or synchronizing files to the server.
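For reference, the kind of client-side behavior I'm imagining is a chunked upload: split each file into fixed-size byte ranges and send them one at a time in the background, so the browser never holds a whole 20 GB file in memory and the UI stays responsive. This is only a rough sketch of the idea, not tested code, and the `/uploads/chunk` endpoint is made up:

```javascript
// Compute the [start, end) byte ranges for a file of `size` bytes,
// split into chunks of at most `chunkSize` bytes each.
function chunkRanges(size, chunkSize) {
  const ranges = [];
  for (let start = 0; start < size; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, size)]);
  }
  return ranges;
}

// In the browser, given a File object `file`, each chunk could then be
// sent in the background without blocking the UI (hypothetical endpoint):
//
// for (const [start, end] of chunkRanges(file.size, 5 * 1024 * 1024)) {
//   const body = file.slice(start, end); // Blob slice, not read into RAM
//   await fetch('/uploads/chunk', {
//     method: 'POST',
//     body,
//     headers: { 'Content-Range': `bytes ${start}-${end - 1}/${file.size}` },
//   });
// }
```

The server side would then need to reassemble the chunks, but the upload loop itself runs asynchronously, so the user can keep working while it progresses.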

Thanks in advance for any advice

fr3d0
  • You are asking about a fairly complicated and difficult problem. I think it is a bit too broad for a single stackoverflow question. I would try researching about file uploads in the browser in general, and then search for specific techniques to handle large files. Regardless, here is one link I found that looks quite relevant: http://stackoverflow.com/questions/25810051/filereader-api-on-big-files – Daniel Waltrip May 17 '17 at 19:38
  • I really don't think it's that broad, but let me sum it up... the question is: is there a way to upload files (10 GB of data or more) asynchronously from an Angular app to a server-side Rails API module? Note: it doesn't have to be Rails; I could just create a smaller app in another language if that's the best way to solve the issue. – fr3d0 May 17 '17 at 20:01

0 Answers