With this code:
cout << std::hex << cout.flags() << endl;
Before C++17, the order in which the operands are evaluated is unspecified, so the compiler is allowed to evaluate it in this order:
ios_base::fmtflags f = cout.flags(); // store value before applying std::hex
cout << hex;
cout << f;
cout << endl;
So you're not guaranteed to "see" the flag change this way. The result is merely unspecified, though; it is not undefined behavior.
The flags are a "bitmask type", which is defined to have certain properties – the actual type used is implementation-defined, but integers, enums, and std::bitsets are possibilities. You can use the normal bit-manipulation operators: ^, &, |, and ~:
bool is_hex(std::ios_base &s) {
    return (s.flags() & s.basefield) == s.hex;
}
// is_oct is identical, except with s.oct

// Nothing set in basefield means "determine base from input" for istreams,
// and ostreams use base 10. This makes is_dec harder to write.
bool is_anybase(std::istream &s) {
    return (s.flags() & s.basefield) == 0;
}

bool is_dec(std::istream &s) {
    std::ios_base::fmtflags base = s.flags() & s.basefield;
    return base == s.dec;
}

bool is_dec(std::ostream &s) {
    std::ios_base::fmtflags base = s.flags() & s.basefield;
    return (base == s.dec) || (base == 0);
}
// Purposeful overload ambiguity on std::iostream.
// In C++11 (then known as C++0x), we can write:
bool is_dec(std::iostream &s) = delete;
As an example, this is how std::hex works:
std::ios_base& hex(std::ios_base &s) {
    s.setf(s.hex, s.basefield);
    return s;
}
Where setf does:
ios_base::fmtflags flags = s.hex;       // first parameter
ios_base::fmtflags mask = s.basefield;  // second parameter
s.flags((s.flags() & ~mask) | (flags & mask));