In my career I have never had to deal with files this large as the central aspect of an application, and I am a bit stumped.
My application will record a lot of data to a file, which may grow to several gigabytes. The application has to run on Windows (7 onward) and all modern Linux distributions (kernel 4 onward), on both 32-bit and 64-bit machines.
I tried using fread
, fseek
and friends since they are portable, but the offset functions (fseek/ftell) take a signed long
, which is 32-bit on many of my targets, capping me at 2 GB. I mused about splitting the data into multiple 2 GB files so as to keep working with signed long
offsets, but that seems like too much processing overhead.
Is there a set of portable library functions that will help with the task, or do I have to write a thin wrapper around the platform API?
EDIT: To answer some questions from the comments:
- This is a binary data file.
- I don't need to load it all in memory but will have to seek to offsets.
- As for fstream, it seems it wouldn't work based on this, so I didn't factor it in.