I'm trying to split large files (3 GB+) into 100 MB chunks and send those chunks over HTTP. For testing, I'm working with a 29 MB file (size: 30380892 bytes, size on disk: 30384128 bytes), so the 100 MB limit condition never comes into play at the moment.
This is my code:
    List<byte[]> bufferList = new List<byte[]>();
    byte[] buffer = new byte[4096];
    FileInfo fileInfo = new FileInfo(file);
    long length = fileInfo.Length;
    int nameCount = 0;
    long sum = 0;
    long count = 0;

    using (FileStream fs = new FileStream(file, FileMode.Open, FileAccess.Read))
    {
        while (count < length)
        {
            sum = fs.Read(buffer, 0, buffer.Length);
            count += sum;
            bufferList.Add(buffer);
        }

        var output2 = new byte[bufferList.Sum(arr => arr.Length)];
        int writeIdx2 = 0;
        foreach (var byteArr in bufferList)
        {
            byteArr.CopyTo(output2, writeIdx2);
            writeIdx2 += byteArr.Length;
        }
        HttpUploadBytes(url, output2, ++nameCount + fileName, contentType, path);
    }
In this test code I add each buffer I read to a list, and when reading is finished I combine the buffers into one complete array. The problem is that the resulting output2 has a length of 30384128 (the size on disk, not the actual file size), and the file received by the server is corrupted.
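One thing I noticed: the mismatch lines up exactly with the 4096-byte buffer size. Reading 30380892 bytes in 4096-byte chunks takes 7418 reads, and 7418 × 4096 = 30384128, which is exactly the length of output2. A quick check of that arithmetic (numbers taken from the test file above):

```csharp
using System;

class SizeCheck
{
    static void Main()
    {
        long fileLength = 30380892;  // actual file size in bytes
        long bufferSize = 4096;      // size of the read buffer

        // Number of Read calls needed, rounding up for the short final read
        long reads = (fileLength + bufferSize - 1) / bufferSize;

        Console.WriteLine(reads);               // 7418
        Console.WriteLine(reads * bufferSize);  // 30384128 -- matches output2's length
    }
}
```

So every read, including the short final one, seems to contribute a full 4096 bytes to the combined array.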
What am I doing wrong?