
I have a function:

function LocateIP ([String] $IPPool, [String] $IP) {
    return $IPPool.contains($IP)
}

where $IPPool holds the contents of a file with many IPs and $IP is, of course, the IP to look up. The function needs to return true if $IP is inside the IP pool.

This works fine; the problem arises when I iterate over a file of IPs line by line and run the LocateIP function on each one.

If the file holds more than 50k IPs, iterating and checking line by line takes a lot of time, and it only gets worse as the file grows.

Is there another way that would let me work with bigger files?

Shahar
    $ippool is nothing more than a parameter/variable that is explicitly cast to a single string, not a file. You need to show your actual code and problem you are having. – Doug Maurer Nov 14 '21 at 20:57

1 Answer


I imagine there are plenty of ways to speed it up:

  1. Switch from ForEach-Object { } to foreach() { }; the language keyword avoids the per-item pipeline overhead of the cmdlet.

  2. Don't call a PowerShell function for every line. Instead of calling LocateIP, run the $line.contains($ip) test directly. PowerShell function calls have a lot of overhead.

  3. Avoid using Get-Content to read the file, when you could use [System.IO.File]::ReadAllLines() instead.

  4. Push all the work down to a faster cmdlet such as Select-String -Pattern ([regex]::Escape($IP)) -Path file.txt

  5. IP addresses form a tree structure; if you need to load the file once then do lots of checks, you could make a structure with faster lookup performance.
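Suggestions 1–3 combine naturally into a single loop. A minimal sketch (the file path and IP value are illustrative):

```powershell
# Read the whole file at once with ReadAllLines (faster than Get-Content),
# loop with the foreach keyword (faster than ForEach-Object), and test each
# line inline with .Contains() instead of calling a PowerShell function.
$ip = '192.168.1.10'
$found = $false
foreach ($line in [System.IO.File]::ReadAllLines('C:\temp\ippool.txt')) {
    if ($line.Contains($ip)) { $found = $true; break }
}
$found
```

Breaking out of the loop on the first hit also avoids scanning the rest of the file once a match is found.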
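For suggestion 4, the whole scan can be pushed into Select-String, which runs in compiled code. A sketch with an illustrative path:

```powershell
# [regex]::Escape() stops the dots in the IP from acting as regex wildcards;
# -Quiet makes Select-String return $true/$false instead of match objects.
$ip = '192.168.1.10'
Select-String -Pattern ([regex]::Escape($ip)) -Path 'C:\temp\ippool.txt' -Quiet
```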
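For suggestion 5, a full tree (trie) is one option; for exact-match lookups, a HashSet gives a similar effect with much less code. A sketch, assuming one IP per line in the pool file:

```powershell
# Pay the file-read cost once, then each lookup is a constant-time hash
# probe instead of a scan of the whole file. Exact matching also avoids
# false positives from substring checks (e.g. '1.2.3.4' inside '1.2.3.40').
$pool = [System.Collections.Generic.HashSet[string]]::new(
    [string[]][System.IO.File]::ReadAllLines('C:\temp\ippool.txt'))
$pool.Contains('192.168.1.10')
```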

TessellatingHeckler