What's the most efficient (and least error-prone / generally "proper") way, if one exists, of handling data from files in C++, line by line? That is, only one line from the file will be used at a time, to perform some lengthy calculations before moving to the next one. I've thought of the following options but can't decide which one is more appropriate.
At the moment I'm doing something like this (open, do all the work, close at the end):

    string line;
    fstream myfile;
    int numlines = 1000;
    myfile.open("myfile.csv");
    for (int i = 0; i < numlines; i++) {
        getline(myfile, line);
        // do something using read data
    }
    myfile.close();
Open and close right after each line is read (this wouldn't hurt speed too much, as the calculations take far longer than the data import, but the file has to be re-read from the beginning each time):

    string line;
    fstream myfile;
    int numlines = 1000;
    for (int i = 0; i < numlines; i++) {
        myfile.open("myfile.csv");
        for (int j = 0; j < i + 1; j++)
            getline(myfile, line);
        myfile.close();
        // do something using read data
    }
Read all the data at once (this would need to be stored in a ~1000x30 2D array, as each `line` is split by commas; note that `numlines` must be a compile-time constant for the array declaration to be legal C++):

    string line;
    fstream myfile;
    const int numlines = 1000;
    double data[numlines][30];
    myfile.open("myfile.csv");
    for (int i = 0; i < numlines; i++) {
        getline(myfile, line);
        // split by comma, store in data[i][]
    }
    myfile.close();
    for (int i = 0; i < numlines; i++) {
        // do something using data[i][]
    }
Are there any pitfalls here, or is any of the above solutions as good as the others as long as it works? I'm thinking that maybe keeping the file open for a few hours is not a good idea (or is it?), but keeping a large 2D array of doubles in memory doesn't sound right either...