
I'm using this simplified PowerShell v1 script to collect some server statistics. The script is called from an external monitoring server (Zabbix). Collecting these statistics can take longer than 30 seconds (the Zabbix timeout), so I start a second instance of the same script to do the actual data gathering. With this technique, the first instance of the script finishes within the Zabbix timeout, while the data gathering still takes place in the background:

Param([string] $action)

$path    = "\\SERVER\share\"
$file    = $path + "test.txt"

switch ($action) {
    'collect' {
        Write-Host "collecting data, this will take a while..."

        # pause the script to simulate processing time:
        Start-Sleep -s 10

        # for testing, create an empty testfile:
        New-Item -ItemType File $file | Out-Null
        break
    }
    'execute' {
        $script = $path + "ForkedExecute.ps1 collect"
        $vars   = "-ExecutionPolicy ByPass"
        $out    = $path + "stdout.txt"
        $err    = $path + "stderr.txt"

        # without the -Redirect[...] options, the testfile is not created:
        # Start-Process $pshome\powershell.exe -ArgumentList "& `"\$script`" $vars"

        # the testfile is successfully created:
        Start-Process $pshome\powershell.exe -ArgumentList "& `"\$script`" $vars" -RedirectStandardOutput $out -RedirectStandardError $err
        break
    }
    default {
        Write-Host "error - no action defined"
    }
}

I now face the problem that without the -Redirect[...] options, the second instance of the script is started but somehow fails to create the testfile. With the -Redirect[...] options in place, the testfile is created, but the first instance of the script has to wait for the second instance to finish and eventually times out.

How can I fix this script so that the 'collect' part runs successfully and the script does not time out?

  • Simple. Don't have two processes write to the same file at the same time. Can't you use background jobs for this instead of spawning multiple processes? – Ansgar Wiechers Mar 20 '17 at 10:53
  • @ansgar: I'm not writing to the same file at the same time. The script uses an action parameter. The first instance only processes the 'execute' part and then exits; the second instance only processes the 'collect' part, which writes the file. – mokum Mar 20 '17 at 12:53
  • For one thing, your argument list should be an actual list (`'-ExecutionPolicy', 'Bypass', (Join-Path $path 'ForkedExecute.ps1'), 'collect'`); see the first sketch after this thread. However, I'm still not getting what you're trying to accomplish here. Why do you think you need a second process? – Ansgar Wiechers Mar 20 '17 at 13:04
  • @ansgar: the second process does the data gathering, which will take more than 30 seconds. If I call the 'collect' part directly from Zabbix, it times out after 30 secs. I thought I could bypass this timeout by having the script start a second instance in the background to do the gathering. The first instance only fires off the second instance and then exits, reporting the start of the script back to Zabbix within the timeout period. I prefer to keep all logic in one script for easier maintenance and distribution. – mokum Mar 20 '17 at 15:08
  • In that case I would do the collection independently from Zabbix (e.g. as a scheduled task). Write the collected data to a temporary file and replace the actual data file with that temp file after the data collection has completed; see the second sketch after this thread. Then the Zabbix task can be limited to reading the existing data file. – Ansgar Wiechers Mar 20 '17 at 15:35
  • @ansgar: for a single server, that would indeed be the easiest solution. But for the number of servers we are monitoring, I'd like to stick to the one-script-to-rule-them-all scenario. – mokum Mar 21 '17 at 08:14
  • That's what configuration management was invented for. Otherwise I recommend fixing the collection routine so it doesn't take longer than 30 seconds. – Ansgar Wiechers Mar 21 '17 at 09:01
  • @ansgar: If all servers were in the same domain or network, that would be an option. But we monitor servers at numerous different customers. This particular script is used to collect data with the Get-Mailbox and Get-MailboxStatistics cmdlets from the Exchange PSSnapin. Loading this snapin alone already takes 10 seconds... Before, I ran the script mailbox by mailbox (to keep below the timeout), but that had a negative impact on Zabbix performance. So now I run the script for all mailboxes in one go, write the results to a file and then send that to Zabbix. – mokum Mar 21 '17 at 20:48
  • I'm now trying this approach by user1959190 (http://stackoverflow.com/questions/651223/powershell-start-process-and-cmdline-switches). But I'm afraid it's getting more like finding a work-around for a work-around... – mokum Mar 21 '17 at 20:57
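
For illustration, a minimal sketch of the 'execute' branch with the argument list passed as an actual array, as suggested in the comments; -NoProfile and -File are assumptions added here, not part of the original script:

# sketch only: 'execute' branch with -ArgumentList as an array
$path = "\\SERVER\share\"
$out  = Join-Path $path 'stdout.txt'
$err  = Join-Path $path 'stderr.txt'

$argList = @(
    '-NoProfile',
    '-ExecutionPolicy', 'Bypass',
    '-File', (Join-Path $path 'ForkedExecute.ps1'),
    'collect'
)

# Start-Process returns without waiting for the child unless -Wait is used,
# so the first instance can exit within the Zabbix timeout
Start-Process -FilePath "$PSHOME\powershell.exe" -ArgumentList $argList -RedirectStandardOutput $out -RedirectStandardError $err

Passing the arguments as an array avoids the quoting and escaping pitfalls of building a single command string.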
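
And a minimal sketch of the decoupled approach from the comments, where the collection runs outside Zabbix (e.g. as a scheduled task) and the Zabbix check only reads a finished file; the file names are placeholders, not from the question:

# sketch only: collect to a temp file, then swap it in when done
$path     = "\\SERVER\share\"
$dataFile = Join-Path $path 'mailboxstats.txt'
$tempFile = Join-Path $path 'mailboxstats.tmp'

# long-running collection goes here (e.g. Get-Mailbox | Get-MailboxStatistics);
# Get-Date is only a stand-in so the sketch runs as-is
Get-Date | Out-File $tempFile

# the finished temp file replaces the file Zabbix reads,
# so Zabbix never sees a half-written file
Move-Item -Path $tempFile -Destination $dataFile -Force

Because the swap happens only after the collection completes, the Zabbix item itself can stay well under the 30-second timeout.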

0 Answers