I have a bunch of data files I need to read into some multidimensional container, all of which are of the following form:
a1,a2,a3,...,aN,
b1,b2,b3,...,bN,
c1,c2,c3,...,cN,
................
z1,z2,z3,...,zN,
I know from this previous question that a quick way of counting the total number of lines in a file can be achieved as follows:
#include <algorithm>
#include <fstream>
#include <iterator>

std::ifstream is("filename");
int lines = std::count(std::istreambuf_iterator<char>(is), std::istreambuf_iterator<char>(), '\n');
This gives me z, the total number of data sets to read in, each of which contains N data points. The next challenge is to count the number of data values per line, for which I can do the following:
#include <algorithm>
#include <fstream>
#include <iterator>
#include <sstream>
#include <string>

std::ifstream is("filename");
std::string line;
std::getline(is, line);
std::istringstream line_(line);
int points = std::count(std::istreambuf_iterator<char>(line_), std::istreambuf_iterator<char>(), ',');
I can be confident that each file has the same number of data values per line. My question is: is there a nicer/faster way of achieving the above without resorting to getline and dumping a single line into a string? I was wondering if this could be achieved with stream buffers, but having done a bit of searching it's not quite clear to me.
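For reference, the sort of thing I had in mind is a single pass over the stream buffer that counts both delimiters at once, something like the sketch below (`count_shape` is just a name I made up; it assumes every line ends with a trailing comma and a newline, as in the files above):

```cpp
#include <iterator>
#include <sstream>
#include <utility>

// Single pass over the stream buffer, counting commas and newlines.
// Returns {rows, values_per_row}. Assumes each line ends ",\n" as in
// the files described above, so commas / rows == N.
std::pair<long, long> count_shape(std::istream& is) {
    long commas = 0, newlines = 0;
    for (std::istreambuf_iterator<char> it(is), end; it != end; ++it) {
        if (*it == ',')       ++commas;
        else if (*it == '\n') ++newlines;
    }
    long rows   = newlines;                  // z
    long points = rows ? commas / rows : 0;  // N
    return {rows, points};
}
```

but I'm not sure whether this is actually any faster than the two-step version, or whether there is a more idiomatic way to do it.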
Any help would be much appreciated, thank you!