3

By default, stream_get_contents() waits and listens for 60 seconds. I have multiple streams and want to listen to all of them continuously.

Inside the foreach, while I am listening to one stream, I can't listen to the other ones.

What is the solution to listen to and capture the output of all streams continuously?

while (true) {
    // $streamArray is an array of streams obtained by stream_socket_client("tcp://..")
    foreach ($streamArray as $stream) {
        fputs($stream, $command);
        stream_get_contents($stream);  // and update file/DB (blocking call)
    }
}

Note: for every stream I have already called stream_set_blocking($stream, true);

Update:

My requirement is to listen to all streams for some time, say 30 minutes. I can't listen to two streams at the same time, so if I have 5 streams my code is just time-division multiplexing: out of the 30 minutes, each individual stream will be recorded for only 6 minutes.

One solution is to make an AJAX request per stream and record each one independently. I certainly don't want to use this multiple-AJAX-call approach, as it will result in more code as well as more CPU usage.

P K
  • I think you need to run multiple instances of PHP to achieve that. – Anze Jarni Jul 28 '12 at 23:37
  • Yes Anze, I guess I have to run multiple independent AJAX calls. Is parallel monitoring possible in PHP with a single instance? – P K Jul 28 '12 at 23:40
  • Well, you can use APC or Memcache to make variables global across all instances. If you are going to AJAX from the same browser, you might use $_SESSION to keep things in sync. I hope that is what you meant? – Anze Jarni Jul 28 '12 at 23:42
  • related: http://stackoverflow.com/questions/1432477/can-php-asynchronously-use-sockets – Kaii Jul 29 '12 at 00:00
  • @AnzeJarni when two PHP processes use the same session, one process blocks on `session_start()` until the other process closes the session (and releases the session lock). Then you have the same blocking issues you are trying to avoid by using multiple PHP instances. In addition to APC and Memcache, IPC sockets may be an option here, but then again you have one parent process that must tie all the streams from APC, Memcache and IPC sockets together. Asynchronous / non-blocking reads are the answer. – Kaii Jul 29 '12 at 10:42

3 Answers

7

With stream_set_blocking($resource, true) you read the stream(s) synchronously, which means each call to fread() waits until there is data to read. You then invoke stream_get_contents(), which reads from the blocking stream until it reaches EOF (i.e. the stream is closed). The result is that you read one stream after another, not "simultaneously".

Reading streams this way is common and the easiest to code, but when you want to handle multiple streams simultaneously, you must handle the buffers, timing and "end of stream" detection yourself. With your current code this part is abstracted away from you via blocking streams and stream_get_contents().

To read multiple streams simultaneously, you should rewrite your code to read the stream(s) asynchronously.

// $streamArray is an array of streams obtained by stream_socket_client("..");

$buffers = array();
$num_streams = 0;
foreach ($streamArray as $stream) {
    // set streams to non-blocking for asynchronous reading
    stream_set_blocking($stream, false);
    // write the command to all streams - multiplexing
    fputs($stream, $command);
    // initialize one buffer per stream, keyed by the resource id
    // (a resource itself cannot be used as an array key)
    $buffers[(int) $stream] = '';
    $num_streams++;
}

while ($num_streams > 0) {
    // use stream_select() to wait for new data instead of using 100% CPU.
    // note that the stream arrays are passed by reference and are modified
    // by stream_select() to reflect which streams have a new event, so we
    // pass a copy of the original stream array.
    // also note that due to a limitation in the Zend Engine you cannot pass
    // a NULL literal directly - for more info read the manual:
    // https://php.net/stream_select
    $no_stream = NULL;
    $select_read = $streamArray;
    stream_select($select_read, $no_stream, $no_stream, NULL);

    // if there is new data, read it into the buffer(s)
    foreach ($select_read as $stream) {
        $buffers[(int) $stream] .= fread($stream, 4096);
        // check whether the stream has reached its end
        if (feof($stream)) {
            $key = array_search($stream, $streamArray, true);
            // close the stream properly
            $num_streams--;
            fclose($stream);
            // remove the stream from the array of open streams
            unset($streamArray[$key]);
        }
    }
}

// do something with your buffers - or use them inside the while() loop already
print_r($buffers);
Kaii
  • usleep is not really nice. use the `select` call. – Karoly Horvath Jul 29 '12 at 00:06
  • 1) remove the stream from the read array when it's finished 2) for the timeout use `null`: "Using a timeout value of 0 allows you to instantaneously poll the status of the streams, however, it is NOT a good idea to use a 0 timeout value in a loop as it will cause your script to consume too much CPU time." – Karoly Horvath Jul 29 '12 at 09:47
  • @KarolyHorvath thx again for pointing out some leaks in my untested code – Kaii Jul 29 '12 at 10:37
  • @Kaii Also, how would you tell, for a particular stream, the difference between that stream disconnecting and there simply being no new data from it? – Attila Szeremi Sep 11 '16 at 10:49
  • @attilaszeremi PHP abstracts that away from you; that's what feof() does in the code example above. – Kaii Sep 11 '16 at 16:02
1

PHP runs as a single process on the server. The "best" way to do something like this would be to spawn a thread for each stream, with each thread having a single stream to listen to. Multithreading automatically takes advantage of multiple CPUs and allows the threads to run in parallel.

Unfortunately, PHP does not allow you to create threads. There are, however, a couple of options for workarounds (note: this question may be of some benefit to you).

Option 1: Clone the Process

PHP does allow you to clone the current process with functions like pcntl_fork. Basically, you create a copy of the current PHP process, and that copy takes over and starts running alongside the parent. Do that once for each stream that needs to be listened to. The complication is making sure each process listens to the right stream. (Hint: keep a list of streams and an index pointing to the next stream in the list. Fork [clone the process]; the child starts listening to that stream, while the parent advances the index to the next stream and creates another child, and so on.) A rough sketch of this idea follows.
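As an illustration only (not from the original answer): a minimal fork-per-stream sketch, assuming the pcntl extension is available (CLI SAPI) and that $streamArray and $command already exist as in the question.

$children = array();
foreach ($streamArray as $stream) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    }
    if ($pid === 0) {
        // child process: handle exactly one stream, then exit
        fputs($stream, $command);
        while (!feof($stream)) {
            $data = fread($stream, 4096);
            // ... update file/DB with $data ...
        }
        exit(0);
    }
    // parent process: remember the child's PID and fork for the next stream
    $children[] = $pid;
}
// parent waits until all children have finished
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}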

Option 2: Multiple PHP instances

You can use a cron job, multiple calls, or some other service (even a command-line call) to run the PHP script once per stream, passing a parameter that identifies which stream to listen to. This will run each PHP script/file in parallel and will achieve what you want. A sketch of such a worker follows.
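For illustration, a minimal sketch of such a worker; the file name listen_stream.php and its command-line arguments are hypothetical, not from the question. Each invocation connects to one address and records that single stream:

// listen_stream.php - hypothetical worker, launched once per stream, e.g.:
//   php listen_stream.php "tcp://host1:1234" "SOME COMMAND" &
//   php listen_stream.php "tcp://host2:1234" "SOME COMMAND" &
if ($argc < 3) {
    fwrite(STDERR, "usage: php listen_stream.php <address> <command>\n");
    exit(1);
}
list(, $address, $command) = $argv;

// connect to the single stream this worker is responsible for
$stream = stream_socket_client($address, $errno, $errstr, 30);
if ($stream === false) {
    fwrite(STDERR, "connect failed: $errstr ($errno)\n");
    exit(1);
}

// send the command and record everything the stream sends back
fputs($stream, $command);
while (!feof($stream)) {
    $data = fread($stream, 4096);
    // ... update file/DB with $data, as in the question ...
}
fclose($stream);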

Option 3: Don't use PHP

PHP wasn't technically built for stuff like this, and this could probably be easily achieved in C, C++, or Java. You should re-think your setup.

Option 4: Create PHP or Apache Module

You can write a module in C for PHP or Apache, and then call its functions from within PHP. This will probably be pretty complicated, and I don't recommend it.

All in all, I recommend rethinking your setup and not using PHP. But if you are confined to PHP, then multiple AJAX calls are the way to go.

cegfault
  • @Praveen this answer is unrelated to your question. Multiple threads / processes can indeed solve your problem, by separating the streams into multiple scripts, each reading one stream at a time. However, the real solution is asynchronous reading. (see my answer) – Kaii Jul 28 '12 at 23:57
  • @Praveen additionally, this solution will definitely lead to more CPU consumption - which is what you are trying to avoid. – Kaii Jul 28 '12 at 23:57
1

What is the solution to listen to and capture the output of all streams continuously?

This issue has been troubling programmers for quite a long time. The book UNIX Network Programming by W. Richard Stevens would be a very good starting point if you are after a solid solution to your problem.

Kaii