Possible duplicate: How do you determine the ideal buffer size when using FileInputStream?
When reading raw data from a file (or any input stream) using either C++'s istream family's read() or C's fread(), you have to supply a buffer and a count of how many bytes to read. Most programs I have seen seem to arbitrarily choose a power of 2 between 512 and 4096.
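For concreteness, the pattern I mean looks something like this (the file name "data.bin" is just a placeholder, and the 4096 is exactly the kind of arbitrarily chosen power of 2 I'm asking about):

```cpp
#include <cstddef>
#include <fstream>
#include <iostream>
#include <vector>

int main() {
    std::ifstream in("data.bin", std::ios::binary); // placeholder file name
    std::vector<char> buffer(4096);                 // the arbitrary power-of-2 size
    std::size_t total = 0;

    while (in) {
        in.read(buffer.data(), buffer.size());
        std::streamsize n = in.gcount();            // bytes actually read (may be short at EOF)
        if (n <= 0) break;
        // ... process n bytes of buffer here ...
        total += static_cast<std::size_t>(n);
    }
    std::cout << "read " << total << " bytes\n";
}
```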
- Is there a reason the size has to/should be a power of 2, or is this just programmers' natural inclination toward powers of 2?
- What would be the "ideal" number? By "ideal" I mean the fastest. I assume it would have to be a multiple of the underlying device's buffer size, or maybe of the underlying stream object's buffer. How would I determine the size of those buffers, anyway? And once I do, would using a multiple of it give any speed increase over using the exact size?
EDIT
Most answers seem to say that it can't be determined at compile time. I am fine with determining it at runtime.
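In case it helps frame answers: on POSIX systems, this is the kind of runtime probing I had in mind. fstat() reports st_blksize, which is documented as the preferred I/O block size for the file; whether reading in multiples of it actually helps is exactly my question. (Same placeholder file name as above; this is POSIX-only, not portable C++.)

```cpp
#include <cstdio>
#include <fcntl.h>
#include <sys/stat.h>
#include <unistd.h>

int main() {
    int fd = open("data.bin", O_RDONLY);  // placeholder file name
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) == 0) {
        // st_blksize: the filesystem's preferred block size for I/O on this file
        std::printf("preferred I/O block size: %ld\n", (long)st.st_blksize);
    }
    close(fd);
    return 0;
}
```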