If I have a file with billions of characters, what are the fastest ways to read it without using external libraries? (I'm asking mainly for competitive programming.)
I found an article that showed the example below and claimed it was faster to use the InParser
class rather than ifstream
(here's the link, but it's not in English, so it may not help much: https://infogenius.ro/parsare-cpp/). It says that when you read directly with ifstream
, a few things happen under the hood that slow reading down, so instead it's better to read the file into a character buffer in large chunks and do the conversion to other data types (int in this example) yourself.
#include &lt;cctype&gt;
#include &lt;cstdint&gt;
#include &lt;fstream&gt;
#include &lt;iostream&gt;
#include &lt;vector&gt;
using namespace std;

class InParser {
  private:
    vector&lt;char&gt; str;
    int ptr;
    ifstream fin;

    // Refill the buffer from the file once every character has been consumed.
    char getChar() {
        if (ptr == (int) str.size()) {
            fin.read(str.data(), str.size());
            ptr = 0;
        }
        return str[ptr++];
    }

    // Skip any non-numeric characters, then parse an optional sign and the digits.
    template&lt;class T&gt;
    T getInt() {
        char chr = getChar();
        while (!isdigit(chr) && chr != '-')
            chr = getChar();
        int sgn = +1;
        if (chr == '-') {
            sgn = -1;
            chr = getChar();
        }
        T num = 0;
        while (isdigit(chr)) {
            num = num * 10 + chr - '0';
            chr = getChar();
        }
        return sgn * num;
    }

  public:
    InParser(const char* name) : str(1e5), ptr(str.size()), fin(name) { }
    ~InParser() { fin.close(); }

    template&lt;class T&gt;
    friend InParser& operator>>(InParser& in, T& num) {
        num = in.getInt&lt;T&gt;();
        return in;
    }
};

int main() {
    InParser fin("file.in");
    int a; int64_t b;
    fin >> a >> b;
    cout << a + b << '\n';
    return 0;
}
My questions are:
1. Is this approach faster than just using std::ifstream fin("file.in");?
2. Are there even better methods to do this without using external libraries?