I want to read some numbers from standard input, process them, and then read the next batch of numbers. The workaround I came up with is to read the terminating "EOF character" into a `char` and then clear the eofbit, failbit, and badbit. The following code works on Ubuntu 14.04 with GCC 4.9.2:
#include <iostream>
#include <vector>

int main() {
    std::vector<double> a;
    std::vector<double>::iterator it;
    double x;

    while (std::cin >> x) {
        a.push_back(x);
    }
    std::cout << "First bunch of numbers:" << std::endl;
    for (it = a.begin(); it != a.end(); ++it) {
        std::cout << *it << std::endl;
    }

    // get crap out of buffer
    char s;
    std::cin >> s;
    std::cin.clear();

    // go for it again
    while (std::cin >> x) {
        a.push_back(x);
    }
    std::cout << "All the numbers:" << std::endl;
    for (it = a.begin(); it != a.end(); ++it) {
        std::cout << *it << std::endl;
    }
    return 0;
}
So, on Ubuntu I can type `1<Return>2<Return>^D`, get some output, type `3<Return>4<Return>^D`, get more output, and the program terminates. On Mac OS 10.10, however, using the same GCC version, the program does not accept the second round of input; after hitting `^D` the first time, it prints the first sequence of numbers twice and exits.
- Why is the behavior inconsistent between the two platforms? Is it possible to work around it?
- What would be the idiomatic way to accept input twice?
- In my use case, the first bunch of numbers may eventually be read from a file or a pipeline. How can I ask for additional input interactively in that scenario as well?