
I have a PowerShell script that uses Invoke-Command. I only want to query a machine and update its value if I haven't retrieved anything for it before.

This command takes a long time. We have about 10,000 computers in the computers CSV file.

Invoke-Command -ComputerName accepts an array of computers

How can I speed up this script?

Here is my script:

$computers = Import-Csv "C:\temp\computers.csv"

$AllResults = foreach ($computer in $computers) {
    try {
        if (($computer.Appx -eq 'Port closed or computer unreachable') -or ($null -eq $computer.Appx)) {
            Invoke-Command -ComputerName $computer.Name -ScriptBlock { Get-AppxPackage -AllUsers } -ErrorAction 'Stop' | ForEach-Object {
                [pscustomobject]@{
                    "ComputerName" = $computer.Name
                    "Appx"         = $_.Name -join ";"
                }
            }
        }
    }
    catch {
        [pscustomobject]@{
            "ComputerName" = $computer.Name
            "Appx"         = "Port closed or computer unreachable"
        }
    }
}

$AllResults | Export-Csv -Path "C:\Temp\MSStoreReport.csv" -NoTypeInformation -Encoding 'UTF8'
  • The best way is to split the CSV file and run the parts in parallel. That would mean the output file would also be split. Better would be to use a database that contains the CSV data and results; use a database that is multi-threaded, like SQL Server. – jdweng Apr 18 '23 at 09:23
  • Simple way to go parallel: create a script that queries the local computer and generates a CSV, then copies the result to a network share. Push that script to the computers, invoke it with Task Scheduler, and combine the results later. – vonPryz Apr 18 '23 at 09:49
  • `invoke-command $computers { script... }` would run in parallel – js2010 Apr 18 '23 at 11:54
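A minimal sketch of that last suggestion (and of the question's note that -ComputerName accepts an array), assuming the CSV has Name and Appx columns as in the script above. The pending names go to Invoke-Command in a single call, so the remoting layer fans out in parallel on its own. Unlike the original loop, this writes one row per computer with all package names joined, and unreachable machines land in the $unreachable error variable instead of the output:

    # Sketch only: assumes a CSV with Name and Appx columns, as in the question.
    $computers = Import-Csv "C:\temp\computers.csv"

    # Keep only the machines that have no result yet or failed last time.
    $pending = $computers | Where-Object {
        $_.Appx -eq 'Port closed or computer unreachable' -or [string]::IsNullOrEmpty($_.Appx)
    }

    # One call, many computers: Invoke-Command fans out in parallel on its own.
    # Connection failures are collected in $unreachable instead of stopping the run.
    # (Assumes $pending is not empty.)
    $results = Invoke-Command -ComputerName $pending.Name -ScriptBlock {
        [pscustomobject]@{
            ComputerName = $env:COMPUTERNAME
            Appx         = (Get-AppxPackage -AllUsers).Name -join ';'
        }
    } -ErrorAction SilentlyContinue -ErrorVariable unreachable

    $results | Select-Object ComputerName, Appx |
        Export-Csv -Path "C:\Temp\MSStoreReport.csv" -NoTypeInformation -Encoding 'UTF8'

If the failed machines still need to appear in the report, the names in $unreachable would have to be written back as 'Port closed or computer unreachable' rows afterwards.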

1 Answer


Use PowerShell 7 and the ForEach-Object cmdlet's -Parallel parameter:

Import-Csv "C:\temp\computers.csv" | ForEach-Object -Parallel {
    $Computer = $_
    try {
        if (($Computer.Appx -eq 'Port closed or computer unreachable') -or ($null -eq $Computer.Appx)) {
            Invoke-Command -ComputerName $Computer.Name -ScriptBlock { Get-AppxPackage -AllUsers } -ErrorAction 'Stop' | ForEach-Object {
                [pscustomobject]@{
                    "ComputerName" = $Computer.Name
                    "Appx"         = $_.Name -join ";"
                }
            }
        }
    }
    catch {
        [pscustomobject]@{
            "ComputerName" = $Computer.Name
            "Appx"         = "Port closed or computer unreachable"
        }
    }
} | Export-Csv -Path "C:\Temp\MSStoreReport.csv" -Encoding 'UTF8'

You might want to play with the -ThrottleLimit parameter to define how many threads/remote computers you want to handle in parallel.
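For example, the only change to the pipeline above is the extra parameter (16 here is an arbitrary illustration, not a recommendation):

    Import-Csv "C:\temp\computers.csv" | ForEach-Object -Parallel {
        # ...same try/catch body as in the answer above...
    } -ThrottleLimit 16 | Export-Csv -Path "C:\Temp\MSStoreReport.csv" -Encoding 'UTF8'

Without -ThrottleLimit, ForEach-Object -Parallel runs at most 5 script blocks at a time by default.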

  • Thanks. BTW, Invoke-Command is multi-threaded out of the box. I believe Invoke-Command uses a default -ThrottleLimit of 25. – Arbelac Apr 18 '23 at 16:26
  • *invoke-command is multi-threaded out of the box*: I guess you're right (you can start a background job with `-AsJob`), but that would require checking each job/computer to see whether it is ready before retrieving the results. In your initial script you run the jobs in sequence, since you wait for each job/computer to return its result before continuing with the next one. `Foreach-Object -Parallel` handles the invocation and retrieval in each thread and puts the results on the pipeline as a whole. – iRon Apr 18 '23 at 17:03
  • Thanks again. Lastly, I have a different question: https://stackoverflow.com/questions/76065787/updating-input-csv-file-based-on-server-status – Arbelac Apr 21 '23 at 07:40
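For completeness, a rough sketch of the `-AsJob` pattern discussed in the comments above, assuming the same CSV file; Wait-Job and Receive-Job take care of the "is each job ready yet" step the comment mentions:

    $computers = Import-Csv "C:\temp\computers.csv"

    # One parent job with a child job per computer; unreachable machines become
    # failed child jobs instead of blocking the loop.
    $job = Invoke-Command -ComputerName $computers.Name -AsJob -ScriptBlock {
        [pscustomobject]@{
            ComputerName = $env:COMPUTERNAME
            Appx         = (Get-AppxPackage -AllUsers).Name -join ';'
        }
    }

    # Block until all child jobs finish, then collect whatever came back; errors
    # from failed child jobs are suppressed here and could be mapped back to
    # 'Port closed or computer unreachable' rows if needed.
    $results = $job | Wait-Job | Receive-Job -ErrorAction SilentlyContinue
    $results | Select-Object ComputerName, Appx |
        Export-Csv "C:\Temp\MSStoreReport.csv" -NoTypeInformation -Encoding 'UTF8'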