
Our company manages over one hundred servers, and we would like to "ask" these servers for basic usage info once or twice a day over HTTP. The usage info can easily be gathered by a Perl CGI script, and an HTTP interface would make it easier to create and test the collection scripts. It seems overkill to run Apache, or even nginx+fcgiwrap, to serve one or two requests per day. We were thinking of using openbsd-inetd (which is already installed on all the servers) to launch a web server that could pass the request to the Perl CGI script and then quit. What are good alternatives for doing this?

I've managed to get this perlscript.pl to work, but I'm not sure if it is the right approach.

#!/usr/bin/perl

use strict;
use warnings;

{
    package BackupWebServer;

    use HTTP::Server::Simple::CGI;
    use base qw(HTTP::Server::Simple::CGI);


    # Map request paths to handler subs.
    my %dispatch = (
        '/hello' => \&resp_hello,
    );


    # Run under Net::Server::INET so the server talks to the socket that
    # inetd hands us on STDIN/STDOUT and exits after serving the request.
    sub net_server { 'Net::Server::INET' }

    sub handle_request {

        my $self = shift;
        my $cgi  = shift;

        my $path = $cgi->path_info();
        my $handler = $dispatch{$path};

        if (ref($handler) eq "CODE") {
            print "HTTP/1.0 200 OK\r\n";
            $handler->($cgi);
        } else {
            print "HTTP/1.0 404 Not found\r\n";
            print $cgi->header,
            $cgi->start_html('Not found'),
            $cgi->h1('Not found'),
            $cgi->end_html;
        }

    }


    sub resp_hello {

        my $cgi  = shift;   # CGI.pm object
        return if !ref $cgi;

        my $who = $cgi->param('name') // 'anonymous';   # default avoids an uninitialized-value warning

        print $cgi->header,
        $cgi->start_html("Hello"),
        $cgi->h1("Hello $who!"),
        $cgi->end_html;

    }


}

BackupWebServer->new()->run(
    log_file => 'Sys::Syslog',
    user => 'root',
    group => 'root'
    );

With the following entry in inetd.conf:

8901    stream  tcp     nowait  root    /home/perl/scriptname.pl
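
For reference, the collecting side can stay in Perl as well. A minimal sketch of how the central machine might poll each server, assuming HTTP::Tiny; the host names are placeholders, while port 8901 and the /hello path match the configuration above:

#!/usr/bin/perl
# Hypothetical collector: poll each server's inetd-spawned service and
# print whatever it returns. Host names are placeholders; port 8901 and
# the /hello path come from the inetd.conf entry above.

use strict;
use warnings;

use HTTP::Tiny;

my @servers = qw(server01.example.com server02.example.com);
my $http    = HTTP::Tiny->new(timeout => 10);

for my $host (@servers) {
    my $res = $http->get("http://$host:8901/hello?name=collector");
    if ($res->{success}) {
        print "=== $host ===\n", $res->{content};
    }
    else {
        warn "$host: $res->{status} $res->{reason}\n";
    }
}
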
Ricardo Marimon
  • Have a look here: http://stackoverflow.com/questions/21321042/simplest-way-to-host-html/21321291#21321291 – Mark Setchell Feb 14 '14 at 16:52
  • It may be overkill, but Apache is also very simple in that it's easily installed, supported and configured. Is running an Apache instance that's idle most of the time anyway really such a big performance hit? I wouldn't think so. And there is monetary value in something that's easy to set up and maintain vs. a more complicated home-rolled solution that just saves a few MB. – TypeIA Feb 14 '14 at 18:46
  • A few MB of disk space, at that. An unused daemon will simply end up swapped out if RAM is needed. – ikegami Feb 14 '14 at 19:32
  • Why complicate things by using HTTP when you can just run a script over SSH as ikegami suggested? `ssh myhost /path/to/command arg1 arg2 ...` – ThisSuitIsBlackNot Feb 14 '14 at 21:45
  • I need to restrict SSH access to the servers as much as possible. It seems extreme to need SSH access to a server just to get some basic stats from it. This is why Nagios uses NRPE instead of just SSHing into the servers. – Ricardo Marimon Feb 15 '14 at 03:51
  • How is that extreme? You can limit the commands that can be run over SSH using the `authorized_keys` file (you could limit yourself to a single command if you want). You can create a user with restricted privileges to run the script. Even if the script needs to be run as root, you could do that in `cron` on each machine and use an unprivileged user to connect with SSH and gather the results. Note that NRPE is not incredibly secure, unless you apply a third-party patch; that's why some people use `check_by_ssh` instead. – ThisSuitIsBlackNot Feb 18 '14 at 15:49
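
To make the `authorized_keys` restriction from the last comment concrete, here is a minimal sketch of a forced-command wrapper; the wrapper path, the key material, and the stats-script path are all placeholders:

#!/usr/bin/perl
# Hypothetical forced-command wrapper for the authorized_keys approach
# described in the comment above. The matching key entry could look like
# (paths and key material are placeholders):
#
#   command="/usr/local/bin/usage-wrapper.pl",no-port-forwarding,no-pty ssh-rsa AAAA... collector@central
#
# Whatever command the client asks for, only the fixed stats script runs.

use strict;
use warnings;

exec '/usr/local/bin/usage-stats.pl'
    or die "cannot exec usage-stats.pl: $!\n";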

1 Answer


If you don't want to add a daemon to those machines, then you'll have to use an existing one. I presume SSH is installed? I'd use that. Possibly a more secure solution than using HTTP anyway.
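
For example, a minimal sketch of a collector that runs the existing usage script on each machine over SSH and captures the output; the host names and the remote script path are placeholders, and key-based authentication is assumed to be set up for the collector user:

#!/usr/bin/perl
# Sketch: gather the usage info over SSH instead of HTTP.
# Host names and the remote script path are placeholders.

use strict;
use warnings;

my @servers = qw(server01.example.com server02.example.com);

for my $host (@servers) {
    # BatchMode avoids hanging on a password prompt when keys are missing.
    my $output = `ssh -o BatchMode=yes $host /usr/local/bin/usage-stats.pl`;
    if ($? == 0) {
        print "=== $host ===\n", $output;
    }
    else {
        warn "failed to query $host\n";
    }
}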

ikegami
  • You don't even need to install the script on the remote system if you don't want to. You could execute `perl` with no arguments and feed the script to its STDIN over the socket. – ikegami Feb 14 '14 at 19:37
  • I was thinking of using inetd... to run a Perl CGI script. – Ricardo Marimon Feb 14 '14 at 20:30
  • CGI is a limited means of communicating between a process and its child. It won't help here. You'd have to go to a lot of work to set up an environment you don't even need. Maybe you meant you'd pass information using `application/x-www-form-urlencoded` for the request and/or response, but JSON would be easier. – ikegami Feb 14 '14 at 20:34
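
Putting the last two comments together, a minimal sketch: keep the stats script on the central machine, feed it to the remote perl over STDIN, and have it answer in JSON. The host name, script name, and stat keys are all placeholders:

#!/usr/bin/perl
# Sketch of the approach from the comments above: ship the stats script
# over the SSH connection's STDIN (nothing to install remotely) and have
# it print a single JSON object. Host, script name, and the stat keys
# (load, disk_free) are placeholders.

use strict;
use warnings;

use JSON::PP qw(decode_json);

my $host   = 'server01.example.com';
my $script = 'usage-stats.pl';    # local copy that prints one JSON object

# The remote perl reads the script from STDIN; we capture its output.
my $json = `ssh -o BatchMode=yes $host perl < $script`;
die "failed to query $host\n" if $? != 0;

my $stats = decode_json($json);
printf "%s: load=%s disk_free=%s\n",
    $host, $stats->{load} // '?', $stats->{disk_free} // '?';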