EDIT: I eventually tested this in the Fiddler Command Line Prompt and the results are the same, so this is not related to PowerShell, but I am keeping the post as it is for context.
My issue: I am running a PowerShell script to create an archive of my Fiddler session.
$command_dump_fiddler = "`"C:\Users\[...]\AppData\Local\Programs\Fiddler\ExecAction.exe`" dump"
Invoke-Expression -Command "& $command_dump_fiddler";
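For reference, the same call can also be made without Invoke-Expression by using the call operator directly; this is just a sketch assuming the same ExecAction.exe path (username elided as above, variable name mine):

$exec_action = "C:\Users\[...]\AppData\Local\Programs\Fiddler\ExecAction.exe"
# The call operator handles the quoting of the path; "dump" is passed as a plain argument.
& $exec_action dump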
I have slightly changed the original script to measure the execution time:
$command_stop_fiddler = "`"C:\Users\[...]\AppData\Local\Programs\Fiddler\ExecAction.exe`" quit"
$command_dump_fiddler = "`"C:\Users\[...]\AppData\Local\Programs\Fiddler\ExecAction.exe`" dump"
Measure-Command { Invoke-Expression -Command "& $command_dump_fiddler"; Invoke-Expression -Command "& $command_stop_fiddler"; Wait-Process -Name "Fiddler" -ErrorAction Ignore }
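A small helper along these lines can make repeated runs easier to compare (the Measure-FiddlerDump name is only illustrative, not part of my actual script):

function Measure-FiddlerDump {
    param([string]$ExecAction)
    # Dump the current capture to a .saz archive, ask Fiddler to quit,
    # then wait until the process has actually exited before stopping the clock.
    Measure-Command {
        & $ExecAction dump
        & $ExecAction quit
        Wait-Process -Name "Fiddler" -ErrorAction Ignore
    }
}

# One run per capture, e.g. for the single and the doubled .saz described below:
(Measure-FiddlerDump -ExecAction "C:\Users\[...]\AppData\Local\Programs\Fiddler\ExecAction.exe").TotalSeconds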
What brought me here: my Fiddler logs have gotten larger than before, but the execution time seems to have increased much more than the log size has.
Now, I have done some testing and realised that the dump command does not scale linearly. Here's an example with a .saz file of 150 KB containing 17,000 IDs:
The dump command takes 1 minute and 40 seconds.
Now, if I open it twice, so I simply have the same data duplicated (300 KB and 34,000 IDs):
The dump command takes 5 minutes.
My question: why? A .saz file is just an archive, so dumping the same data twice should not take more than twice as long as dumping it once; here, doubling the data roughly tripled the time (from 100 seconds to 300 seconds). Perhaps there is a setting I am missing?
Many thanks in advance