
I have a .txt file with 19,500,000 lines (each line is the full path of a file or folder). I need to loop over it with a foreach loop and get the ACLs, then save the output to CSV, but I need to split the CSV output into a new file every 1,000 lines.

I am at a loss; I appreciate any help or recommendation.

mklement0
Jimcesseg
    Look into `Get-Content` with the `-ReadCount` parameter. Should be what you're looking for – Abraham Zinala Oct 10 '21 at 03:56
  • Please refer to [this URL](https://stackoverflow.com/questions/2016894/how-can-i-split-a-large-text-file-into-smaller-files-with-an-equal-number-of-lin) to fulfil your question – Narayana Lvsl Oct 11 '21 at 15:52

1 Answer


As Abraham Zinala mentions, you might want to take advantage of `Get-Content -ReadCount` to read the file in chunks of 1,000 lines at a time, then write each chunk to its own CSV file before continuing with the next:

```powershell
$outputCounter = 1

Get-Content path\to\huge\file.txt -ReadCount 1000 | ForEach-Object {
    # Iterate over the file paths in this chunk, fetch ACL information,
    # then write the chunk (up to 1000 records) to a new CSV file
    $_ | ForEach-Object {
        Get-Item -LiteralPath $_ -PipelineVariable item |
            Get-Acl |
            Select-Object @{Name='Path';Expression={$item.FullName}}, Owner -ExpandProperty Access
    } | Export-Csv ".\path\to\output_${outputCounter}.csv" -NoTypeInformation

    # Increment the output file counter
    $outputCounter++
}
```
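To see the chunking behavior in isolation: with `-ReadCount`, `Get-Content` emits arrays of up to 1,000 lines rather than individual lines, so each pipeline iteration receives a whole batch. A minimal sketch that only splits the input file, with hypothetical file names, might look like this:

```powershell
$chunkIndex = 1

# -ReadCount 1000 makes Get-Content emit string[] batches of up to 1000 lines
Get-Content .\paths.txt -ReadCount 1000 | ForEach-Object {
    # $_ is the whole batch; write it to its own numbered file
    $_ | Set-Content ".\paths_chunk_$chunkIndex.txt"
    $chunkIndex++
}
```

This also keeps memory use bounded, which matters for a 19.5-million-line file: at no point is the whole file held in memory, only the current batch.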
Mathias R. Jessen