
I've created a function which hashes a file (~2GB) and writes the resulting hash to another file. However, the buffer size appears to correlate directly with the speed of the calculation: with the buffer set to 1024 it is noticeably slower than when it is set to 1048576, for example.

Increasing the buffer beyond 1048576 slows it down again, however, and I wondered why that is.

It appears that 1MB is the ideal size, I'm just not sure why! Thank you in advance.
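The question doesn't include the original code, but the effect can be reproduced with a minimal sketch of chunked file hashing (shown here in Python with `hashlib`; the function name and SHA-256 choice are assumptions for illustration). Very small buffers incur one read-call's worth of overhead per tiny chunk, while buffers much larger than the CPU cache stop helping because each chunk no longer fits in cache while it is being hashed, so a middle value like 1 MiB often wins.

```python
import hashlib

def hash_file(path, buffer_size):
    """Hash a file in chunks of buffer_size bytes; return the hex digest.

    Illustrative sketch: the digest is identical for any buffer_size,
    only the time taken changes.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(buffer_size)
            if not chunk:
                break
            h.update(chunk)
    return h.hexdigest()
```

Timing `hash_file(path, 1024)` against `hash_file(path, 1048576)` on a large file should show the gap described above, while both calls return the same digest.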

Joseph Smith
    maybe that's the size of your cache in your system? – asr9 May 05 '19 at 23:22
    This previous question (and its links to other SO questions) "[how the cache size and array size affect the performance of mathematical operations on an array?](https://stackoverflow.com/questions/19083642/how-the-cache-size-and-array-size-affect-the-performance-of-mathematical-operati)" may be useful. – Weather Vane May 05 '19 at 23:26

0 Answers