
Let's say I have the piece of code below and I use it to run a simple command like ls, and let's assume that the output is 100 lines long.

When we do cout << buff in the code below, does the output get streamed byte by byte, bit by bit, line by line, or at what interval?

Let's say we want to print the string new_line! at the end of every line as we stream cout << buff. How can we do that?

I'm thinking that in the while loop we should check whether we just streamed (or are about to stream) \n into our stringstream output, and then do something after (or before) it streams into the output.

Eventually I will store this in a data structure, so I just need to know how to check for \n as we are buffering, and where to put the function that adds new_line! (or whatever I want to do).

Here's the code:

#include <iostream>
#include <stdio.h>

using namespace std;
// imaginary 
// stringstream output;


int main() {
    FILE *in;
    char buff[512];

    if(!(in = popen("ls -sail", "r"))){
        return 1;
    }

    while (fgets(buff, sizeof(buff), in) != NULL) {
        // output << buff;
        // if there is a new line:
        // output << "new_line!";

        cout << buff;
    }
    pclose(in);

    return 0;
}
Logan

3 Answers


If you use the GNU libc then you could use getline() instead of fgets (it is also standard in POSIX.1-2008). See here for a more detailed explanation.

Otherwise, use whatever facility your system provides to read a line from a FILE*. If you really can't find such a facility (which would be very surprising), then at worst you can always use this approach. fgets is also a good solution, provided you account for the buffer size limit and do some extra buffering yourself if required (check the documentation, everything is explained there).

The thing to remember is that popen just returns a FILE*, so anything that works on a standard file will also work in your case. Examples are plentiful on the web.


Now for the "interval" part of your question: I'm afraid it makes little sense as asked. It depends on the interval at which the external command writes new data to its standard output. In any case, you'll be able to read the data in your program as soon as the external command writes it.

syam

Just like fgets on stdin with stdio: fgets blocks until it has read a complete line (up to and including the \n), filled the buffer (511 characters plus the terminator here), or hit EOF (the program has exited and closed its end of the pipe).

So each iteration of the while loop will get one line of output (or at most 511 characters of an overlong line), as soon as the child program writes it.

You can test with a small program like this:

#include <thread>
#include <chrono>
#include <iostream>

int main() {
    std::cout << "hello" << std::endl;
    std::this_thread::sleep_for(std::chrono::seconds(5));
    std::cout << " world" << std::endl;
}

you will notice that "hello" is printed immediately, the loop then blocks for the five-second sleep, and " world" appears as soon as it is written; fgets finally returns NULL when the program exits and its stdout closes.

To manage the actual chunking yourself in a standards-compliant manner you can use std::fgetc and build your own buffer.

http://en.cppreference.com/w/cpp/io/c/fgetc

Your own getline might look like this:

std::string get_line(FILE* f) {
    int c;  // int, not char, so that EOF (a negative value) is representable
    std::string out;
    while ((c = std::fgetc(f)) != EOF && c != '\n') {
        out += static_cast<char>(c);
    }
    if (c == '\n')
        out += '\n';  // keep the newline, as fgets would
    return out;
}


//...

// usage (note: feof() only becomes true after a read has hit EOF,
// so the final iteration may print one empty string)
while (!std::feof(in)) {
    cout << get_line(in);
}
111111

The pipe will buffer the data as it comes in, up to some implementation-defined limit (typically 64 KiB on modern Linux); once that limit is reached the pipe "blocks" writes, so ls will stall until you read some data out. Conversely, if you read faster than ls writes, your thread will block waiting for data. If you are printing to the screen, your own output is likely to be the limiting rate.

A pipe as such is not "block-buffered" or "line-buffered" - it's just a pipe: whatever is put in one end comes out the other. Even if the internals of the C library ask for, say, 4000 bytes, a read on a pipe returns as soon as at least one byte is available, so fgets won't stall waiting for a full buffer's worth of data.

Mats Petersson