
I'm trying to compress IIS log files (60 GB total) using Compress-Archive; however, I get the error:

"Exception of type 'System.OutOfMemoryException' was thrown."

$exclude = Get-ChildItem -Path . | Sort-Object -Descending | Select-Object -First 1
$ArchiveContents = Get-ChildItem -Path . -Exclude $exclude | Sort-Object -Descending
Compress-Archive -Path $ArchiveContents -DestinationPath .\W3SVC2.zip -Force

I've already adjusted MaxMemoryPerShellMB to 2048MB and restarted the WinRM service.
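For reference, the quota was adjusted along these lines (a sketch using the WSMan provider; the exact commands may differ in your environment):

# Raise the per-remote-shell memory quota, then restart WinRM to apply it.
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 2048
Restart-Service WinRM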

RAM consumption exceeds 6 GB when the command is executed.

joebegborg07
  • Are you running the code directly on the server that contains the files or remotely? – henrycarteruk Apr 25 '17 at 16:45
  • 6 GB is over the 2 GB limit you set. There is always the silly workaround of setting up the command as a scheduled task, so that it's not running in a remote session (see the sketch after these comments). Or you could increase MaxMemoryPerShellMB. Or you could use a different, less memory-intensive tool such as 7zip to compress. – BenH Apr 25 '17 at 17:16
  • You could also compress in batches as opposed to one large file. – Matt Apr 25 '17 at 17:17
  • @BenH I'm aware of 7zip; however, I'm trying to stay away from open-source tools as much as I can, unless it's a last resort. Thanks for your suggestions. By any chance, is it possible to use the Compress-Archive cmdlet with minimal system resources, the way tools like 7zip do? – joebegborg07 Apr 25 '17 at 18:13
  • [This function](https://github.com/santysq/Compress-BigFiles) overcomes the 2 GB limitation of `Compress-Archive`, is faster, and can compress multiple files and folders by allowing pipeline input. Originally posted as an answer to [this question](https://stackoverflow.com/q/72607926/15339544). – Santiago Squarzon Jun 18 '22 at 06:29
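For illustration, a minimal sketch of BenH's scheduled-task workaround (hypothetical task name and script path; it assumes the compression script already exists on the server):

# Placeholder names; the task runs the compression locally, outside the remote session's memory quota.
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Scripts\Compress-IISLogs.ps1'
Register-ScheduledTask -TaskName 'Compress IIS Logs' -Action $action -User 'SYSTEM'
Start-ScheduledTask -TaskName 'Compress IIS Logs'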

2 Answers


As suggested by Matt, I would recommend archiving it into sets of files. Something like:

# Process the files ten at a time so only a small batch is handled per call.
For ($i = 0; $i -lt $ArchiveContents.Count; $i += 10) {
    Compress-Archive -Path $ArchiveContents[$i..($i+9)] -DestinationPath .\W3SVC2-$i.zip -Force
}

That way you're only working with 10 files at a time. You may want to adjust the number of files depending on how many you have and how large they are; for example, if you have 300 files at 200 MB each, 10 at a time may be fine, whereas if you have 3,000 files at 20 MB each you may want to increase that to 50 or even 100.

Edit: I just looked, and Compress-Archive supports adding files to an archive by specifying the -Update parameter. You should be able to do the same thing as above, slightly modified, to make 1 large archive.

For ($i = 0; $i -lt $ArchiveContents.Count; $i += 10) {
    # -Update adds each batch of ten files to the existing archive instead of recreating it.
    Compress-Archive -Path $ArchiveContents[$i..($i+9)] -DestinationPath .\W3SVC2.zip -Update
}

That should just add 10 files at a time to the target archive, rather than trying to add all of the files at once.

TheMadTechnician
  • Many thanks for your suggestion. I'll keep you updated if I have trouble executing it. I was hoping to cut the operation down to a single .zip file, as it simplifies the restoration process. Are you aware if it's possible to limit PowerShell to use 'x' amount of memory to execute the code? – joebegborg07 Apr 25 '17 at 18:05
  • I am not aware of how to limit it, but I did update my answer to only use 1 archive file. – TheMadTechnician Apr 25 '17 at 18:38
  • You can also trigger garbage collection manually to free memory on each iteration (see the sketch after these comments). – Mark Wragg Apr 25 '17 at 19:03
  • @TheMadTechnician many thanks sir. I used your script to solve my issue. – joebegborg07 Apr 27 '17 at 12:00
  • Note that `-Force` and `-Update` switches are in separate parameter sets, so they cannot be used simultaneously. – Bronx Sep 21 '18 at 21:57
  • @Bronx Thank you, I have updated the answer accordingly. – TheMadTechnician Sep 24 '18 at 18:00
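For illustration, a minimal sketch of Mark Wragg's garbage-collection suggestion applied to the loop above (not part of the original answer):

For ($i = 0; $i -lt $ArchiveContents.Count; $i += 10) {
    Compress-Archive -Path $ArchiveContents[$i..($i+9)] -DestinationPath .\W3SVC2.zip -Update
    # Ask the .NET garbage collector to release memory held by the previous batch.
    [System.GC]::Collect()
}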

The solution below offers better performance than Compress-Archive by calling the .NET System.IO.Compression.ZipFile class directly, which streams files into the destination archive instead of buffering them in memory.

Create OptimalCompression.ps1

# Load the .NET assembly that provides ZipFile before it is referenced.
Add-Type -AssemblyName System.IO.Compression.FileSystem

$source = $args[0]
$destination = $args[1]
Write-Host "Compressing $source to $destination"

# Optimal favors compression ratio over speed; $false keeps the source folder itself out of the archive root.
$compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
$includeBaseDirectory = $false
[System.IO.Compression.ZipFile]::CreateFromDirectory($source, $destination, $compressionLevel, $includeBaseDirectory)

Run the script with the required parameters:

C:\> powershell .\OptimalCompression.ps1 c:\YourSourceFolder c:\YourDestinationFile.zip
0xFK