I'm currently working on a method that reads a file's bytes (mostly .PDFs) and encodes them to Base64. However, I'm somewhat concerned about possible memory issues when using File.ReadAllBytes() should someone attach a really large file, so I wondered whether it would be possible to split the file, read each part's bytes, and concatenate them into the final string, which I'll then encode and work into the XML in the next step.
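For reference, chunked encoding like that only concatenates into a valid Base64 string if each chunk's length is a multiple of 3, so that no '=' padding lands mid-string. A rough sketch of the idea, with a made-up helper name and placeholder chunk size (not from any library):

```csharp
using System;
using System.IO;
using System.Text;

static class ChunkedBase64
{
    // Encode a stream to Base64 in chunks. The chunk size must be a
    // multiple of 3 so each chunk encodes without internal '=' padding
    // and the concatenated result matches a one-shot Convert.ToBase64String.
    // Assumes Read fills the buffer except at end of stream.
    public static string Encode(Stream input, int chunkSize = 3 * 1024)
    {
        var sb = new StringBuilder();
        var buffer = new byte[chunkSize];
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            sb.Append(Convert.ToBase64String(buffer, 0, read));
        return sb.ToString(); // note: the whole encoded file is in memory here
    }
}
```

Note this still materializes the entire encoded string in memory, which is the concern the comments raise.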
- Yeah, you can stream the results to a file. However, you are just kicking the ball down the road to the next time you read the file. – TheGeneral Feb 01 '21 at 03:34
- Is what you are looking for already answered here? https://stackoverflow.com/questions/5208592/reading-parts-of-large-files-from-drive. Though if the end result is to concat them into a final string, then you're still going to have the entire file sitting in memory at some point. – Peter Henry Feb 01 '21 at 04:01
1 Answer
I am not sure this will solve your problem, because you will likely need to process the file again anyway.
However, here is how you could stream the conversion of a file to a Base64 file. It uses an 80 * 1024 buffer, which is likely only suitable for an SSD yet stays under the 85,000-byte large object heap threshold, so it will not allocate on the LOH. You can change it to a smaller buffer if you have a spinning drive (HDD); the default is 4k.
using System.IO;
using System.Security.Cryptography;

using var inputFile = new FileStream(
    @"D:\SomeFile",
    FileMode.Open,
    FileAccess.Read,
    FileShare.None,
    80 * 1024, // large buffer, yet not LOH
    FileOptions.SequentialScan);

using var base64Stream = new CryptoStream(
    inputFile,
    new ToBase64Transform(),
    CryptoStreamMode.Read);

using var outputFile = new FileStream(
    @"D:\SomeOtherFile",
    FileMode.Create,
    FileAccess.Write,
    FileShare.None,
    80 * 1024); // large buffer, yet not LOH

// Await the copy so the streams are not disposed before it completes.
await base64Stream.CopyToAsync(outputFile);
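Going back the other way is symmetric; here is a sketch of streaming the Base64 file back to raw bytes with FromBase64Transform (the D:\RoundTrip output path is just a placeholder):

```csharp
using System.IO;
using System.Security.Cryptography;

using var encodedFile = new FileStream(
    @"D:\SomeOtherFile",
    FileMode.Open,
    FileAccess.Read);

// CryptoStream in Read mode decodes the Base64 text as it is pulled through.
using var decodeStream = new CryptoStream(
    encodedFile,
    new FromBase64Transform(),
    CryptoStreamMode.Read);

using var decodedFile = new FileStream(
    @"D:\RoundTrip",
    FileMode.Create,
    FileAccess.Write);

await decodeStream.CopyToAsync(decodedFile);
```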

TheGeneral