
I want to send a large amount of data (approx. 5 MB) from my AngularJS application to my Node.js server. The Node server is hosted in a Kubernetes cluster, and for certain reasons we are unable to update the nginx configuration.

So is there any possibility that I can send my JSON data in chunks, or convert the JSON data object to a file and send it as a multipart HTTP request? Is this a good idea?
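One way to try the multipart variant is to wrap the serialized JSON in a Blob and POST it as FormData. A minimal AngularJS sketch, assuming a hypothetical /api/upload endpoint and a field name 'data':

    // AngularJS service sketch; the module name, endpoint and field name are assumptions.
    angular.module('app').factory('uploadService', ['$http', function ($http) {
      return {
        uploadLargeJson: function (payload) {
          // Serialize the object once and wrap it as a file-like Blob.
          var blob = new Blob([JSON.stringify(payload)], { type: 'application/json' });
          var form = new FormData();
          form.append('data', blob, 'payload.json');

          return $http.post('/api/upload', form, {
            // Leave Content-Type unset so the browser adds the multipart boundary itself.
            headers: { 'Content-Type': undefined },
            transformRequest: angular.identity
          });
        }
      };
    }]);

Whether this actually gets through still depends on the body-size limit the nginx in front is enforcing, since multipart does not make the payload smaller.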

Syed Ayesha Bebe
  • The app should set the headers for chunked transfer and explicitly end the stream. The server will send a 100 Continue and will detect the end of the stream that the client writes to the request. Review Transfer-Encoding: chunked, which the client needs to request in its headers (see the server-side sketch after these comments): https://stackoverflow.com/questions/4824451/detect-end-of-http-request-body – Robert Rowntree Oct 09 '17 at 04:13
  • As long as you know the length of the file ahead of time, instead of using chunked encoding you could verify that the server handles a gzip wrapper on the body, then gzip the file, add that length to the request headers, and POST the gzipped object as the entity body of the request (see the gzip sketch after these comments) – Robert Rowntree Oct 09 '17 at 13:44
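On the Node side, the chunked approach from the first comment needs nothing special: Node's HTTP server consumes a Transfer-Encoding: chunked body transparently and normally answers an Expect: 100-continue on its own, so the handler just reads the request stream until 'end'. A minimal sketch (route and port are assumptions):

    // Plain Node HTTP server that buffers the incoming (possibly chunked) body.
    var http = require('http');

    http.createServer(function (req, res) {
      var chunks = [];
      req.on('data', function (chunk) { chunks.push(chunk); });
      req.on('end', function () {
        // Reassemble the body and parse the JSON once the client ends the stream.
        var body = Buffer.concat(chunks).toString('utf8');
        var payload = JSON.parse(body);
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ ok: true, keys: Object.keys(payload).length }));
      });
    }).listen(3000);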
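For the gzip variant from the second comment, the browser can compress the JSON string (for example with a library such as pako), send it with Content-Encoding: gzip and an explicit Content-Length, and the server can inflate it with Node's built-in zlib. A server-side sketch under those assumptions:

    // Inflate a gzip-compressed JSON body before parsing it.
    var http = require('http');
    var zlib = require('zlib');

    http.createServer(function (req, res) {
      // Pipe through gunzip only when the client declared a gzip-wrapped body.
      var source = req.headers['content-encoding'] === 'gzip'
        ? req.pipe(zlib.createGunzip())
        : req;

      var chunks = [];
      source.on('data', function (chunk) { chunks.push(chunk); });
      source.on('end', function () {
        var payload = JSON.parse(Buffer.concat(chunks).toString('utf8'));
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ ok: true, keys: Object.keys(payload).length }));
      });
    }).listen(3000);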

0 Answers