I'm using the following code:
const size_t MonaBuffSize = 1024 * 1000;
std::ifstream file(Path.string(), std::ifstream::binary);
MD5_CTX md5Context;
MD5_Init(&md5Context);
char buf[MonaBuffSize];
while (file.good()) {
    file.read(buf, sizeof(buf));
    MD5_Update(&md5Context, buf, file.gcount());
}
unsigned char result[MD5_DIGEST_LENGTH];
MD5_Final(result, &md5Context);
When I set MonaBuffSize to 1024 * 1024, the program crashes when file.read is called.
Are there any buffer size limits in the ifstream::read() function? I know that it is probably better to use smaller buffers, but I would like to understand what the problem is here.
I'm using Visual Studio 2019, C++ 17.
EDIT: As recommended in comments, I returned to the previous method that I used with boost::iostreams::mapped_file_source:
const std::string md5_from_file(const std::filesystem::path Path, std::uintmax_t Size)
{
    unsigned char result[MD5_DIGEST_LENGTH];
    if (Size > 0) {
        try {
            boost::iostreams::mapped_file_source src;
            src.open(Path.string());
            MD5((unsigned char*)src.data(), src.size(), result);
        }
        catch (std::ios_base::failure const& e)
        {
            MD5((unsigned char*)"", 0, result);
            //MessageBox(NULL, e.what(), "EXCEPTION", MB_YESNO | MB_ICONINFORMATION);
        }
    }
    else {
        MD5((unsigned char*)"", 0, result);
    }

    std::ostringstream sout;
    sout << std::hex << std::setfill('0');
    for (auto c : result) sout << std::setw(2) << (int)c;
    return sout.str();
}
Additionally, I realized that it was Windows Defender that was slowing disk reads so much, so my attempts to speed them up were pointless.