
The script below is considerably cut down in terms of the number of user properties being requested, and there are approximately 50,000 users to iterate through. My previous approach was to first get a collection of all users and then run Get-ADUser against each one for the extra information; however, that approach was taking a long time (up to 16 hours). With the approach in the script below, where Get-ADUser is run only once, the script flies through approximately 20,000 users in 11 minutes. At that stage, however, you can see it suddenly slow down, and it eventually crashes out with the error pasted at the end of the code. Is there a way around this issue?
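
For context, the previous per-user approach looked roughly like this (a simplified sketch reconstructed from the description above, not the original code; it uses the same $UsrProps, $csv and $domain variables as the script below and issues one extra AD query per user, which is what made it so slow):

$AllUsers = Get-ADUser -Filter * -server $domain   # first pass: every user, default properties only
foreach ($user in $AllUsers) {
    # second pass: one additional AD round trip per user (~50,000 queries in total)
    Get-ADUser -Identity $user.SamAccountName -Properties $UsrProps -server $domain |
        Select-Object $UsrProps |
        Export-Csv "$PWD\Logs\$domain\Users\$csv" -NoTypeInformation -Append
}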

$csv = "Computers-{0:dd-MM-yyyy_HHmm}.csv" -f (get-date)
$UsrProps = "SamAccountName",
"AccountExpirationDate",
"accountExpires",
"AccountLockoutTime",
"BadLogonCount",
"badPwdCount",
"badPasswordTime",
"SamAccountName"


Get-ADUser -Filter *  -Properties $UsrProps -server $domain |
ForEach-Object {

    $hash = @{
        AccountExpirationDate = $_.AccountExpirationDate
        AccountLockoutTime    = $_.AccountLockoutTime
        accountExpires        = $_.accountExpires
        BadLogonCount         = $_.BadLogonCount
        badPwdCount           = $_.badPwdCount
        badPasswordTime       = $_.badPasswordTime
        SamAccountName        = $_.SamAccountName
    }
       
    $PSCustObj = [pscustomobject]$hash
    $results = $PSCustObj
    $results |
    select-object @{ l = "SamAccountName"; e = { [string]$_.SamAccountName } },
    @{ l = "AccountExpirationDate"; e = { [string]$_.AccountExpirationDate } },
    @{ l = "AccountLockoutTime"; e = { [string]$_.AccountLockoutTime } },
    @{ l = "BadLogonCount"; e = { [string]$_.BadLogonCount } },
    @{ l = "badPwdCount"; e = { [string]$_.badPwdCount } },
    @{ N = 'badPasswordTime'; E = { [DateTime]::FromFileTime($_.badPasswordTime) } }  | 
    export-csv "$PWD\Logs\$domain\Users\$csv" -noTypeInformation -Append
    
}


Get-ADUser : Exception of type 'System.OutOfMemoryException' was thrown.
At line:231 char:5
+     Get-ADUser -Filter *  -Properties $UsrProps -server $domain |
+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-ADUser], OutOfMemoryException
    + FullyQualifiedErrorId : ActiveDirectoryCmdlet:System.OutOfMemoryException,Microsoft.ActiveDirectory.Management.Commands.GetADUser
Itchydon
  • Approximately how much memory is in your VM? Have you tried beefing the memory to the maximum swap size on your host and just brute-forcing it? – Mooshua Aug 18 '23 at 18:56
  • It is not a VM and there is plenty of memory and CPU present (even when I get the error there is plenty available) – Itchydon Aug 18 '23 at 18:57

1 Answer


You can greatly streamline your command, which may fix the out-of-memory problem (it definitely reduces overall memory consumption and speeds things up):

Get-ADUser -Filter * -Properties $UsrProps -server $domain |
  Select-Object @{ l = 'SamAccountName'; e = { [string]$_.SamAccountName } },
    @{ l = 'AccountExpirationDate'; e = { [string]$_.AccountExpirationDate } },
    @{ l = 'AccountLockoutTime'; e = { [string]$_.AccountLockoutTime } },
    @{ l = 'BadLogonCount'; e = { [string]$_.BadLogonCount } },
    @{ l = 'badPwdCount'; e = { [string]$_.badPwdCount } },
    @{ N = 'badPasswordTime'; E = { [DateTime]::FromFileTime($_.badPasswordTime) } } | 
  Export-Csv "$PWD\Logs\$domain\Users\$csv" -NoTypeInformation

Presumably, memory pressure can still build due to deferred garbage collection, so the following variation may be necessary, which runs the .NET garbage collector periodically, after processing each batch of N users (tweak N as needed):

$i = 0
Get-ADUser -Filter * -Properties $UsrProps -server $domain |
  ForEach-Object {

    # Run the garbage collector after every N (here: 1,000) users processed.
    if (++$i % 1000 -eq 0) { [GC]::Collect(); [GC]::WaitForPendingFinalizers() }

    # Construct and output the output object for this user.
    [pscustomobject] @{
      SamAccountName        = [string]$_.SamAccountName
      AccountExpirationDate = [string]$_.AccountExpirationDate
      AccountLockoutTime    = [string]$_.AccountLockoutTime
      BadLogonCount         = [string]$_.BadLogonCount
      badPwdCount           = [string]$_.badPwdCount
      badPasswordTime       = [DateTime]::FromFileTime($_.badPasswordTime)
    }

  } |
  Export-Csv "$PWD\Logs\$domain\Users\$csv" -NoTypeInformation
mklement0
  • Wow, this looks promising and much easier to read. Please forgive me: owing to the sheer size of the script, I will not be able to make the changes and test until tomorrow - but thanks for this – Itchydon Aug 18 '23 at 19:32
  • Unfortunately, although it does seem to run even faster with the code streamlined, I still get the error. I have tried tweaking the number of users for the garbage collection (and even disabled it), but it still happens – Itchydon Aug 20 '23 at 12:09
  • Sorry to hear it, @Itchydon. Perhaps there is a way to retrieve the AD results in batches (I don't know and I don't have access to AD). `-ResultPageSize` controls "the number of objects to include in one page for an AD DS query." There's also `-ResultSetSize` for the _total_ number of objects to retrieve, but I wouldn't know how to implement a _paging_ mechanism with it, i.e. how to tell subsequent calls where to resume. Another thought is to enumerate the OUs and call `Get-ADUser` for each (a rough sketch of this idea follows below these comments). – mklement0 Aug 20 '23 at 12:55
  • Cheers @mklement0 - I will give both approaches a go! – Itchydon Aug 20 '23 at 15:57
  • Let me take that back: I have tried it again today and it works like a treat! Don't know why it failed yesterday, but I have managed to run the code successfully a few times, so I am very happy! I did not need to play with `-ResultPageSize`, `-ResultSetSize` or enumerate the OUs in the end. Thank you so much! – Itchydon Aug 21 '23 at 12:25
  • Glad to hear it, @Itchydon - I assume it was the 2nd snippet, the one with the periodic garbage-collection calls, that worked, right? With a batch size of `1,000`? – mklement0 Aug 21 '23 at 12:43
  • ... actually, I have tested it both with and without, and it appears to work without it too. So the simplification of the code is what seems to have done it, but I have left the garbage collection in, just in case it helps in the future if the data grows – Itchydon Aug 21 '23 at 14:08
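
For reference, the per-OU idea mentioned in the comments could look something like the sketch below. It is untested and makes assumptions: Get-ADOrganizationalUnit must be available where the script runs, and any accounts that sit in plain containers rather than OUs (such as the default CN=Users container) would be missed, so it would need adapting before real use:

# Untested sketch: query each OU separately so that no single Get-ADUser call
# has to stream the entire 50,000-user result set in one go.
Get-ADOrganizationalUnit -Filter * -Server $domain |
  ForEach-Object {
    # -SearchScope OneLevel returns only users directly under this OU, so each
    # user is emitted exactly once even though nested OUs are enumerated separately.
    Get-ADUser -Filter * -SearchBase $_.DistinguishedName -SearchScope OneLevel `
               -Properties $UsrProps -Server $domain
  } |
  Select-Object @{ l = 'SamAccountName'; e = { [string]$_.SamAccountName } },
    @{ l = 'AccountExpirationDate'; e = { [string]$_.AccountExpirationDate } },
    @{ l = 'AccountLockoutTime'; e = { [string]$_.AccountLockoutTime } },
    @{ l = 'BadLogonCount'; e = { [string]$_.BadLogonCount } },
    @{ l = 'badPwdCount'; e = { [string]$_.badPwdCount } },
    @{ N = 'badPasswordTime'; E = { [DateTime]::FromFileTime($_.badPasswordTime) } } |
  Export-Csv "$PWD\Logs\$domain\Users\$csv" -NoTypeInformation

Each Get-ADUser call then only has to return the users directly under one OU, which keeps the per-query result set (and the memory it ties up) much smaller than a single domain-wide query.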