Why is my entire file being erased?
Because you cannot read from and write back to the same file in the same pipeline, UNLESS you make sure that the input file is read into memory, in full, before its lines are sent through the pipeline.
To that end, enclose the command that reads the input file in parentheses ((...)), which ensures that it runs to completion, with all output collected in memory and the input file closed again, before pipeline processing starts:
(Import-Csv -Path "C:\Part2List.csv") | # !! Note the (...)
... |
Export-Csv -Path "C:\Part2List.csv" -NoTypeInformation
Note that this approach bears a slight risk of data loss: if the pipeline is interrupted before all data has been written back to the input file, the file is left incomplete.
A more robust approach is to write to a temporary file first, and, on successful completion (only), replace the original file.
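The temporary-file approach can be sketched as follows; the filtering step is a placeholder for your actual pipeline logic:

```powershell
$csvPath = 'C:\Part2List.csv'
$tmpPath = "$csvPath.tmp"
try {
  Import-Csv -Path $csvPath |
    Where-Object { <# your filtering logic here #> $true } |
    Export-Csv -Path $tmpPath -NoTypeInformation
  # Only on successful completion: replace the original file.
  Move-Item -LiteralPath $tmpPath -Destination $csvPath -Force
}
catch {
  # On failure, clean up the temporary file, leaving the original intact.
  Remove-Item -LiteralPath $tmpPath -ErrorAction SilentlyContinue
  throw
}
```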
With either approach, if the original file had special permissions, alternate data streams, ..., you may want to recreate these too.
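For instance, the original file's ACL could be preserved along the following lines (a sketch only; alternate data streams would additionally require `Get-Content` / `Set-Content` with `-Stream`):

```powershell
$csvPath = 'C:\Part2List.csv'
# Capture the original permissions before rewriting the file...
$acl = Get-Acl -LiteralPath $csvPath
# ... rewrite the file (e.g., via the temporary-file technique) ...
# ... then reapply the captured permissions.
Set-Acl -LiteralPath $csvPath -AclObject $acl
```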
As pointed out in Theo's answer, you have an additional problem: CSV column values are always imported as strings, so you must perform explicit conversions as needed:
Since you're doing a date comparison, you must convert $_.Date to a [datetime] instance; since your input is in format MM/dd/yyyy, which is the invariant culture's[1] short date pattern, you can simply cast to [datetime].
(Import-Csv -Path "C:\Part2List.csv") |
Where-Object { [datetime] $_.Date -gt (Get-Date) } |
Export-Csv -Path "C:\Part2List.csv" -NoTypeInformation
[1] PowerShell uses the invariant culture ([cultureinfo]::InvariantCulture) rather than the current culture for to-and-from string conversions in most contexts. For more information, see this answer.