
I'm working on a drive cleanup script in PowerShell. As part of this, I'm using the Get-ChildItem cmdlet to get a list of all files. There are a lot of files (50k+).

I've thrown the output of the cmdlet into a CSV, thinking it will save a bit of RAM. I then import that CSV and process it line by line. My understanding is that only the current row of the CSV is loaded into RAM at any given time. Is that correct?
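
For reference, here's a minimal sketch of what I'm doing (the paths and selected properties are placeholders, not my real ones):

    # Dump the file listing to CSV first
    Get-ChildItem -Path 'D:\Data' -Recurse -File |
        Select-Object FullName, Length, LastWriteTime |
        Export-Csv -Path 'C:\Temp\files.csv' -NoTypeInformation

    # Then re-import the CSV and process it row by row
    Import-Csv -Path 'C:\Temp\files.csv' | ForEach-Object {
        # $_ is one CSV row (one file's metadata)
        $_.FullName
    }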

If the whole CSV is loaded into RAM as it is processed, then this question is moot. In that scenario I'll just chuck the output into a variable for processing.

  • It loads it all. You can use the Get-ChildItem cmdlet directly. – Ramankingdom Jul 27 '17 at 09:10
  • 2
    50k of file descriptions is nothing, don't bother. – wOxxOm Jul 27 '17 at 09:10
  • This link may be helpful: https://stackoverflow.com/questions/33511772/read-file-line-by-line-in-powershell – Ramankingdom Jul 27 '17 at 09:12
  • Cheers for the replies. 50k of file descriptions is just my testing data; production is far more. Doesn't Get-Content also load the whole file into RAM? If possible I'd like to load only the current row into memory. – Charlie Miller Jul 27 '17 at 09:33
  • If you are worried about memory, use "gci * | foreach-object {}" instead of "$files = gci *; foreach ($file in $files) {}". The first one will send the files down the pipeline one at a time (see the sketch after these comments). – kdh Jul 27 '17 at 10:42
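
A minimal sketch of the two patterns from that last comment (the loop bodies are placeholders):

    # Streaming: each FileInfo object is sent down the pipeline one at a time
    Get-ChildItem * | ForEach-Object { $_.Name }

    # Collecting: the entire result array is built in $files before the loop runs
    $files = Get-ChildItem *
    foreach ($file in $files) { $file.Name }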

1 Answer

First of all, don't worry about 50k+ items. It won't take that much memory.

Secondly, it won't load row by row unless you explicitly arrange for only that. By default it loads it all.
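
For example (hypothetical path), capturing the import in a variable materializes every row at once:

    # The whole file is parsed into an array before the next statement runs
    $rows = Import-Csv -Path 'C:\Temp\files.csv'
    $rows.Count   # every row object is already in RAM at this point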

Third, if memory is that critical, you can think about breaking the data into chunks, which will help you reduce the working size.
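
A rough sketch of one way to chunk it (the batch size and the processing step are hypothetical):

    $batchSize = 10000
    $batch = [System.Collections.Generic.List[object]]::new()

    Import-Csv -Path 'C:\Temp\files.csv' | ForEach-Object {
        $batch.Add($_)
        if ($batch.Count -ge $batchSize) {
            # process the current batch here, then release it
            $batch.Clear()
        }
    }

    if ($batch.Count -gt 0) {
        # process the final partial batch here
        $batch.Clear()
    }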

In the case of Get-Content, too, it will load the file as a whole, even though we have a -Tail switch.
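
For illustration (hypothetical path):

    # Assigning the result pulls every line into memory as a string array
    $lines = Get-Content -Path 'C:\Temp\files.csv'

    # -Tail limits the read to the last N lines
    Get-Content -Path 'C:\Temp\files.csv' -Tail 10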

Ranadip Dutta