It is unclear whether Alamofire supports chunked transfer encoding for large or progressively generated data sets. This is a much-needed feature for my application; otherwise I may have to look into alternatives.
The Alamofire GitHub page lists "Progress Closure & NSProgress" among its features, but I'm not sure what that entails.
And per the Wikipedia description of chunked transfer encoding:

> Senders can begin transmitting dynamically-generated content before knowing the total size of that content.
For clarity's sake, let me explain why I need this.
Basically, I have a very large JSON file that is partially cached. The full JSON file is composed of smaller JSON objects. On the server, I am using io.js/Node.js with Express to send the chunked data via `res.write()`; Express knows not to send the `Content-Length` header and instead sends the response as chunked data. I have verified this works with a plain HTML/JS client.
Let me know if you would like me to provide more code to demonstrate this!