
Recently I wrote a script which reads data from a CSV and runs some checks. As the CSV data is huge, I want to process it in batches: execute the first 50 lines and write the output to one folder, then execute the next 50 lines and write the output to another folder.

Below is the line I used to import the CSV file:

$P = Import-Csv -Path .\Processes.csv

and export it using:

Export-Csv -Path "Data"
kumar

2 Answers


`-TotalCount` is the best way to handle these scenarios. Instead of importing the CSV, my recommendation would be to use `Get-Content` and pick only the lines required:

Get-Content .\Processes.csv -TotalCount 50 | Out-File .\Processes_first50.csv

Another option is to pipe the content and then use `Select-Object -First`:

Get-Content .\Processes.csv | select -First 50 | Out-File .\Processes_first50.csv

The last option is to use the `-Head` parameter (an alias of `-TotalCount`):

Get-Content .\Processes.csv -Head 50 > .\Processes_first50.csv

The `>` operator redirects the output to a file; it is similar to `Out-File` but more concise.

Hope it helps.

Ranadip Dutta
    how should I make sure my next count is the next 50, and so on, as there may be 200 records – kumar Sep 19 '22 at 09:28
    @kumar: In that case you have to use a `foreach` loop and iterate them in batches. Search for foreach loop examples to get a direction. But that's an altogether separate question. – Ranadip Dutta Sep 19 '22 at 09:32
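A minimal sketch of that loop idea, using `Select-Object -Skip` and `-First` to take successive 50-record slices. The sample-data setup and the `Batch1`, `Batch2`, … folder names are illustrative assumptions, not part of the original script:

```powershell
# Illustrative setup: create a sample CSV standing in for the real Processes.csv
1..120 | ForEach-Object { [pscustomobject]@{ Id = $_; Name = "Proc$_" } } |
    Export-Csv -Path .\Processes.csv -NoTypeInformation

$batchSize = 50
$rows      = @(Import-Csv -Path .\Processes.csv)
$batches   = [math]::Ceiling($rows.Count / $batchSize)

for ($i = 0; $i -lt $batches; $i++) {
    # Skip the batches already processed, then take the next slice of 50
    $slice  = $rows | Select-Object -Skip ($i * $batchSize) -First $batchSize
    $folder = ".\Batch$($i + 1)"
    New-Item -ItemType Directory -Path $folder -Force | Out-Null
    $slice | Export-Csv -Path (Join-Path $folder 'Processes.csv') -NoTypeInformation
}
```

With 120 sample rows this produces three folders: two with 50 records each and a final one with the remaining 20.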

To complement the helpful answer from Ranadip Dutta and answer the question in the comment: "how should make sure my next count is next 50 and so on as records may be 200?"

You might use this Create-Batch function, see also: Slice a PowerShell array into groups of smaller arrays:

Install-Script -Name Create-Batch

Example:

$BatchNr = 1
Import-Csv -Path .\Processes.csv |Create-Batch -Size 50 |ForEach-Object {
    $_ |ForEach-Object {
        $_ # do something with each item in the batch of 50
    } |Export-Csv ".\Batch$BatchNr.csv"
    $BatchNr++
}
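If, as the original question asks, each batch should land in its own folder rather than just its own file, the same Create-Batch pattern can be extended. The folder and file names here are assumptions for illustration:

```powershell
$BatchNr = 1
Import-Csv -Path .\Processes.csv |Create-Batch -Size 50 |ForEach-Object {
    $folder = ".\Batch$BatchNr"
    # Create the per-batch folder if it does not exist yet
    New-Item -ItemType Directory -Path $folder -Force |Out-Null
    $_ |Export-Csv -Path (Join-Path $folder 'Output.csv')
    $BatchNr++
}
```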
iRon