I often see 4096 used as a default buffer size all over the place. Is there any reason why 4096 was selected as opposed to another value?
- Jon Skeet picked that number. – michael Mar 07 '14 at 20:24
- I believe it has something to do with memory page sizes. – Colin Basnett Mar 07 '14 at 20:25
- @michael: Jon Skeet *invented* that number. Prior to his writing buffer code, the universe skipped from 4095 to 4097. (Couldn't resist. It's an old meme, but a fun one.) – David Mar 07 '14 at 20:25
- It's the size of a page. – David Heffernan Mar 07 '14 at 20:27
- The answer is 42: http://en.wikipedia.org/wiki/Phrases_from_The_Hitchhiker%27s_Guide_to_the_Galaxy – Steve Mar 07 '14 at 20:28
- @DavidHeffernan except if the "page" is an A4 page, 210mm x 297mm – Federico Berasategui Mar 07 '14 at 20:32
- It ***is*** a common [page size](http://en.wikipedia.org/wiki/Page_(computer_memory)), 4*1024 or 4K. – Jeppe Stig Nielsen Mar 07 '14 at 20:36
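
As a couple of the comments note, 4096 bytes matches the memory page size on most common platforms. A minimal sketch to check the page size on your own machine, assuming .NET, where `Environment.SystemPageSize` reports the OS page size:

```csharp
using System;

class PageSizeCheck
{
    static void Main()
    {
        // On most x86/x64 Windows and Linux systems this prints 4096.
        Console.WriteLine($"OS memory page size: {Environment.SystemPageSize} bytes");
    }
}
```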
1 Answer
It really depends on your problem, but 4 KB is a good general-purpose compromise. You can find a good discussion of this choice in the links below:
File I/O with streams - best memory buffer size
C# FileStream : Optimal buffer size for writing large files?
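
For concreteness, here is a minimal sketch of copying one stream to another with a 4 KB buffer; the file names and the `BufferSize` constant are arbitrary illustrations, not anything taken from the linked answers:

```csharp
using System.IO;

class BufferedCopyExample
{
    // 4096 bytes: the common 4 KB default discussed above.
    const int BufferSize = 4096;

    static void Main()
    {
        // Hypothetical file names, purely for illustration.
        using (var source = File.OpenRead("input.dat"))
        using (var destination = File.Create("output.dat"))
        {
            var buffer = new byte[BufferSize];
            int bytesRead;
            // Read up to one buffer's worth at a time until the source is exhausted.
            while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                destination.Write(buffer, 0, bytesRead);
            }
        }
    }
}
```

In practice, larger buffers (often 64 KB or more) can be faster for large sequential files, which is exactly the trade-off the linked questions discuss.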


Bassam Alugili