I'm attempting to upload a file in chunks to a Java server, which then saves it to a file share.
When I try to split the file, its 2 MB size increases quite a bit. I understand this is expected when converting it to base64, but the size grows even when I send it as a binary string.
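For reference, the base64 part of the growth is predictable: base64 encodes every 3 bytes as 4 ASCII characters, so a 2 MB file should come out around 2.67 MB before any JSON escaping on top. A quick check of the math:

```javascript
// base64 emits 4 characters for every 3 input bytes (~33% overhead)
const encoded = btoa("abc");          // 3 bytes in
console.log(encoded.length);          // 4 characters out

// Expected encoded size of a 2 MB file
const twoMB = 2 * 1024 * 1024;        // 2097152 bytes
const b64Size = Math.ceil(twoMB / 3) * 4;
console.log(b64Size);                 // 2796204 characters ≈ 2.67 MB
```

So anything much beyond ~33% growth is coming from somewhere other than base64 itself.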
I was using resumable to do the chunking, which worked great in our local, test, and dev environments but broke in production. I don't have the access to figure out what's going on there, and I'm short on time, so I'd like to figure out how to chunk the file manually and send each chunk up as a normal AJAX request inside a JSON variable, as I'm positive that will work.
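Here is a minimal sketch of the manual chunking I have in mind, using `Blob.slice` and `fetch`. The `/upload` endpoint and the `{ fileName, chunkIndex, totalChunks, data }` payload shape are my own assumptions, not anything resumable does:

```javascript
const CHUNK_SIZE = 512 * 1024; // 512 KB per chunk (arbitrary choice)

// Upload one file as a sequence of JSON POSTs, each carrying a
// base64-encoded chunk. url is a hypothetical endpoint like "/upload".
async function uploadInChunks(file, url) {
  const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
  for (let i = 0; i < totalChunks; i++) {
    const chunk = file.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE);
    const bytes = new Uint8Array(await chunk.arrayBuffer());
    // btoa wants a binary string, so convert byte by byte
    let binary = "";
    for (const b of bytes) binary += String.fromCharCode(b);
    await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        fileName: file.name,
        chunkIndex: i,
        totalChunks,
        data: btoa(binary),
      }),
    });
  }
}
```

The server would then base64-decode each chunk and append it to the target file in `chunkIndex` order; as long as nothing re-encodes the bytes in between, the reassembled file should be byte-identical to the original.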
Is it folly to expect the chunked, uploaded file to end up identical to the one eventually saved to the remote file share? It seems like it should be possible.
I have already tried javascript FileReader - parsing long file in chunks, and most of the other answers use multipart form data, which runs into the aforementioned problem of not working in production.