
I was solving a competitive programming task when I introduced a bug: I placed `ios_base::sync_with_stdio(false), cin.tie(NULL);` inside `solve()` instead of `main()`.

```cpp
#include <bits/stdc++.h>
using namespace std;

void solve() {
    ios_base::sync_with_stdio(false), cin.tie(NULL);
    // implementation not important
}

int main() {
    int t;
    cin >> t;
    while (t--)
        solve();
}
```

This resulted in a memory error. However, when I move that line into `main()`, the solution passes with no issues:

```cpp
#include <bits/stdc++.h>
using namespace std;

void solve() {
    // implementation not important
}

int main() {
    ios_base::sync_with_stdio(false), cin.tie(NULL);
    int t;
    cin >> t;
    while (t--)
        solve();
}
```

My understanding from this answer is that this line disables synchronization between the C and C++ standard streams and then unties cin from cout. Since I'm only using cin/cout anyway, why does moving that line around cause such a drastic difference?

(in case you want the actual source)

Memory error submission

Passing submission

Primusa
  • It seems to be a [compiler bug](https://gcc.gnu.org/bugzilla/show_bug.cgi?id=27931) (or [language feature](https://gcc.gnu.org/bugzilla/show_bug.cgi?id=27931#c6)?). – ph3rin Feb 17 '20 at 03:59
  • What compiler and version? – M.M Feb 17 '20 at 04:21
  • see also https://stackoverflow.com/questions/31816095/ , check if the problem still occurs after fixing this – M.M Feb 17 '20 at 04:21

1 Answer


From [ios.members.static]/2:

> Effects: If any input or output operation has occurred using the standard streams prior to the call, the effect is implementation-defined. Otherwise, called with a false argument, it allows the standard streams to operate independently of the standard C streams.

Furthermore, the cppreference page notes that

> If this function is called after I/O has occurred on the standard stream, the behavior is implementation-defined: implementations range from no effect to destroying the read buffer.

I'm not entirely sure what exactly happened in your case. Reading the source of libstdc++ (e.g., here), which I assume is the standard library being used, it seems that it clears the buffers. This means that some input you were waiting for may have been buffered, and your program either started reading from a strange spot or the reads just failed entirely. But the precise details, and why that would have caused an out-of-memory error, are beyond me.

(It's also worth noting that the fact you're calling it in a loop doesn't seem to be the problem in this implementation, as libstdc++ only clears the buffers if you're already unsynchronised with stdio.)

N. Shead
  • Given that some part of the input buffer may be lost, `cin >> n;` will fail at some point and put `cin` into a fail state. Then OP is not initializing their variable `n` in `solve`, and because `cin` is in a fail state, `cin >> n;` will not set it. They then use that value to construct vectors of that size. So that is undefined behavior, probably just creating vectors of random size (and accessing them). – walnut Feb 17 '20 at 04:24
  • @walnut Yeah, that was originally my thought too. I looked it up though, and according to cppreference, since C++11 if `cin >> n` fails then `n` will be set to 0, in which case we're just creating zero-length vectors, which I think should be fine. It's possible I'm missing something, though, or that cppreference is incomplete, or that GCC doesn't completely follow the standard on this point; I haven't looked much further into this. – N. Shead Feb 17 '20 at 04:28
  • I think that applies only to failure during extraction. If the stream is in a fail state, then extraction will not be attempted at all. GCC/libstdc++ and Clang/libc++ [seem to agree](https://godbolt.org/z/K7sEyP) on this. – walnut Feb 17 '20 at 04:32
  • @walnut Good point; that's what I was missing! In which case yes, your thoughts on them accidentally creating arbitrarily large vectors through UB is probably the truth. – N. Shead Feb 17 '20 at 04:35