
This post is about some automation tasks, and also about satisfying my curiosity.

Is this scenario possible? Can anyone offer any practical pointers?

Run a shell script from PHP:

shell_exec('bash script.sh');

A bash script that would:

  • run a shell;
  • read a file for input;
  • pass the input to the shell;
  • fetch the result from the shell;
  • write the result to another file for output;
  • keep running in an infinite loop (a sketch follows below).
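Because the script above loops forever, a plain shell_exec() call would block on it; in practice you would launch it detached, e.g. shell_exec('bash script.sh > /dev/null 2>&1 &'). As for the loop itself, here is a minimal sketch; it is written as a PHP CLI script to stay in one language for this thread, and the file names worker.php, in.txt and out.txt are placeholders:

    <?php
    // worker.php -- sketch of the polling loop described above.
    while (true) {
        if (file_exists('in.txt')) {
            $cmd = trim(file_get_contents('in.txt'));
            unlink('in.txt');                         // consume the command
            if ($cmd !== '') {
                $result = shell_exec($cmd . ' 2>&1'); // run in a shell, keep stderr
                file_put_contents('out.txt', (string) $result);
            }
        }
        usleep(200000); // poll five times a second instead of busy-waiting
    }

Note that this starts a fresh shell for every command, as AbraCadaver's comment below suggests; keeping a single shell alive across commands is what proc_open() in the first answer is for.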

Then write input commands to the file, for example:

  • wait a few seconds;
  • read the output file for the result;
  • depending on the output, write new input commands to the file;
  • and so the loop continues (see the sketch after this list).
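The controller side of that loop could then look something like this (again only a sketch; the file names and the df/du example commands are made up):

    <?php
    // Controller sketch: queue a command, wait, read the result, react.
    file_put_contents('in.txt', "df -h\n");

    sleep(3); // "wait a few seconds"

    $output = @file_get_contents('out.txt');
    if ($output !== false && strpos($output, '100%') !== false) {
        // Depending on the output, write a new input command.
        file_put_contents('in.txt', "du -sh /var/log\n");
    }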
OBV
  • Yes, this is possible. Where do the original "input commands" come from? – AbraCadaver Dec 05 '13 at 15:51
  • I will write to a file in a shared folder from a terminal on another box in the network. – OBV Dec 05 '13 at 15:52
  • Instead of a file you could make a little server, which would also handle concurrent access; you can easily make one with PHP. Something like this: http://stackoverflow.com/questions/14399801/how-to-use-named-pipes-in-php-between-different-functions-or-even-different-proc or http://squirrelshaterobots.com/programming/php/building-a-queue-server-in-php-part-3-accepting-input-from-named-pipes/ – Prix Dec 05 '13 at 15:53
  • The thing I was most unsure of was how to get shell 1 to pass commands/read output interactively with shell 2. Any ideas? – OBV Dec 05 '13 at 15:54
  • Fairly simple, give it a go... It sounds like only one shell is executed each time through the loop. – AbraCadaver Dec 05 '13 at 15:55

2 Answers


I'm going to leave a shameless link to a post where I demonstrate doing this, "expect in php": http://codehackit.blogspot.be/2012/04/automating-command-line-scripts-in-php.html

Basically it's just a wrapper around proc_open(), which gives you FDs for writing to and reading from another process's stdin/stdout: http://php.net/manual/en/function.proc-open.php
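For reference, a minimal sketch of that pattern (not the code from the linked post, just the core proc_open() usage):

    <?php
    // Keep one bash process alive and talk to its stdin/stdout via pipes.
    $spec = [
        0 => ['pipe', 'r'], // child's stdin:  we write to it
        1 => ['pipe', 'w'], // child's stdout: we read from it
        2 => ['pipe', 'w'], // child's stderr
    ];
    $proc = proc_open('bash', $spec, $pipes);
    if (!is_resource($proc)) {
        exit("failed to start bash\n");
    }

    fwrite($pipes[0], "echo hello from bash\n");
    fclose($pipes[0]);                   // EOF: tell bash no more input is coming

    echo stream_get_contents($pipes[1]); // prints "hello from bash"

    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);

For a truly interactive session you would keep stdin open and poll the pipes with stream_select() instead, as discussed in the comments below.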

smassey
  • Interesting, could you run this from the command line and poll a file for user input on the fly, as opposed to preset automation? (Assuming you could prevent any timeouts.) – OBV Dec 05 '13 at 16:03
  • I'm sure that what you're asking is possible, although I admit that I don't see the use. You would simply open your file as a stream and use the stream_select function to multiplex the three streams together (see the sketch below). – smassey Dec 05 '13 at 17:00
  • @smassey Hey, I have tried your script and am experiencing a problem, can you advise? http://pastebin.com/c41snHpi When I run it, it runs the program, but the browser is timing out... it seems like the test program is waiting on input and the script isn't entering any. – OBV Dec 05 '13 at 17:20
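To flesh out the stream_select() suggestion from the comment above, a rough sketch (it assumes $pipes comes from a proc_open() call like the one in this answer, with stdin left open):

    <?php
    // Multiplex the child's stdout/stderr without blocking forever on either.
    stream_set_blocking($pipes[1], false);
    stream_set_blocking($pipes[2], false);

    $read   = [$pipes[1], $pipes[2]];
    $write  = null;
    $except = null;

    // Wait up to 5 seconds for output on either pipe.
    if (stream_select($read, $write, $except, 5) > 0) {
        foreach ($read as $stream) {
            echo fread($stream, 8192);
        }
    }

This also explains the browser timeout in the pastebin above: if the child sits waiting for input, a blocking read hangs the request, whereas stream_select() returns 0 after its timeout (and the real fix is to write the expected input to $pipes[0] first).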

In order to avoid problems with partial reads and writes (due to I/O buffering and races), you may want to consider using a directory as the in/out-box, like so:

  1. Create your command files in the dir with "temporary" names (e.g. "cmd_`date +%s`.txt.tmp")
  2. When you're done writing to a given command file, close it (to flush the buffers), then rename it to remove the ".tmp". Rename is atomic within a filesystem.
  3. Have the consuming bash "daemon" only look at "cmd_*.txt" (not .tmp) and when it's done with a given command, either delete the cmd file or rename it to give it a ".done" suffix. (If you need multiple parallel worker daemons, you can probably even rename to ".processing" to "claim" a cmd while you work on it. Just be sure to check the return code of the rename when you do so to see if another worker out-raced you.)

Do likewise for the output files.
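A sketch of the consuming side of this scheme, in PHP to match the rest of the thread (a bash daemon would use mv the same way; the spool/ directory name is a placeholder):

    <?php
    // One pass of a consumer using rename-to-claim; run this in a loop.
    foreach (glob('spool/cmd_*.txt') as $file) {
        // rename() is atomic within a filesystem: if another worker claimed
        // this command first, our rename fails and we simply skip the file.
        if (!@rename($file, $file . '.processing')) {
            continue;
        }
        $cmd    = trim(file_get_contents($file . '.processing'));
        $result = shell_exec($cmd . ' 2>&1');

        // Publish the output the same way: write to a ".tmp" name, then rename.
        file_put_contents($file . '.out.tmp', (string) $result);
        rename($file . '.out.tmp', $file . '.out');

        rename($file . '.processing', $file . '.done'); // mark the command done
    }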

Rob Starling