I have a small C++ codebase that reads commands from stdin, executes them, and then writes the results to stdout. I use the wide streams wcin and wcout for this. My problem is that large input lines, around 4000+ characters, get cut off. I have tested this on both Windows and OS X, and the problem occurs on both.
I have created a minimal program that illustrates the problem:
#include <iostream>
#include <string>
#include <sstream>

using namespace std;

int main()
{
    const size_t bufferSize = 2 * 4096;
    wchar_t lineBuffer[bufferSize] = {0};

    wcin.getline(lineBuffer, bufferSize);
    wstring line(lineBuffer);

    wostringstream wos;
    wos << L", state of wcin, badbit: " << wcin.bad();
    wos << L", eof: " << wcin.eof();
    wos << L", failbit: " << wcin.fail();

    wcout << L"The input: " << line << wos.str() << endl;
    return 0;
}
Note that eofbit, failbit, and badbit all look fine when the problem arises.
Code can also be found here with a test string in a comment: https://github.com/Discordia/large-std-input
I can sort of fix this by setting wcin's buffer size to 4096 (note that this is smaller than the input, while the getline buffer is larger than the input), by doing:
const size_t wcinBufferSize = 4096;
wchar_t wcinBuffer[wcinBufferSize] = {0};
wcin.rdbuf()->pubsetbuf(wcinBuffer, wcinBufferSize);
But this only pushes the problem further out. If the input is larger, say 9000 characters (with the wcin.getline buffer upped to 4 * 4096), the problem shows up again.
What is the best way to read the input when I do not know how large it will grow? Should I not be using getline at all?
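One alternative I have been considering (a minimal sketch, assuming the input stays line-oriented) is the free function std::getline, which reads into a std::wstring and grows it as needed instead of relying on a fixed wchar_t buffer:

#include <iostream>
#include <string>

using namespace std;

int main()
{
    // Read one line of unknown length; std::getline(wistream&, wstring&)
    // grows the string as needed, so no fixed-size buffer is involved.
    wstring line;
    if (getline(wcin, line))
    {
        wcout << L"The input: " << line
              << L", length: " << line.size() << endl;
    }
    else
    {
        wcout << L"Read failed, badbit: " << wcin.bad()
              << L", eof: " << wcin.eof()
              << L", failbit: " << wcin.fail() << endl;
    }
    return 0;
}

I have not verified whether this actually sidesteps whatever is cutting the data off at the stream-buffer level, or whether the line is already truncated before getline ever sees it.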