Assuming that you only ever want to record the most recent run (PSv3+; see the bottom if you want to record multiple runs):
$FilePath = './t.csv'   # output log file
$TimeStamp = (Get-Date).ToString("dd/MMM/yyyy HH:mm:ss")

# Log start of execution.
[pscustomobject] @{ Username = $env:USERNAME; StartDate = $TimeStamp } |
Export-Csv -NoTypeInformation $FilePath

# Perform script actions...

# Log end of execution.
(Import-Csv $FilePath) |
Select-Object *, @{ n='FinishDate'; e={ (Get-Date).ToString("dd/MMM/yyyy HH:mm:ss") } } |
Export-Csv -NoTypeInformation $FilePath
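With the above, the finished log file will contain something like the following (sample values):

"Username","StartDate","FinishDate"
"jdoe","01/Jan/2024 10:00:00","01/Jan/2024 10:05:12"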
As noted in boxdog's helpful answer, using -Append with Export-Csv won't add additional columns. However, since you're seemingly attempting to rewrite the entire file, there is no need to use -Append at all.
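To see that limitation in isolation, here is a minimal repro (./demo.csv is just an example path):

# The existing file's header (a single column, A) dictates the columns:
[pscustomobject] @{ A = 1 } | Export-Csv -NoTypeInformation ./demo.csv
# Appending an object with an extra property B does NOT add a column B;
# the file keeps its original single-column structure.
[pscustomobject] @{ A = 2; B = 3 } | Export-Csv -NoTypeInformation -Append ./demo.csv
Import-Csv ./demo.csv   # -> rows with only an .A property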
However, to ensure that the old version of the file is read in full before you attempt to replace it with Export-Csv, be sure to enclose the Import-Csv $FilePath call in (...). This is not strictly necessary with a 1-line file such as in this case, but it is a good habit to form for such rewrites. Do note that this approach is somewhat brittle in general: if something goes wrong while rewriting the file, data loss can result.
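To make the difference concrete, a sketch (Select-Object * merely stands in for the real column transformation):

# !! Broken: Export-Csv opens (and truncates) the file before Import-Csv
# !! has finished reading it, which can leave you with an empty file.
# Import-Csv $FilePath | Select-Object * | Export-Csv -NoTypeInformation $FilePath

# OK: (...) forces the whole file to be read into memory up front.
(Import-Csv $FilePath) | Select-Object * | Export-Csv -NoTypeInformation $FilePath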
@{ n='FinishDate'; e={ (Get-Date).ToString("dd/MMM/yyyy HH:mm:ss") } } is an example of a calculated property/column that is appended to the preexisting columns (*).
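Calculated properties work with any input objects; here is a generic example, unrelated to the log file:

# Adds a SizeKB column derived from each file's .Length property.
Get-ChildItem -File | Select-Object Name, @{ n='SizeKB'; e={ [math]::Round($_.Length / 1KB, 1) } }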
As for: "The other weird thing is that it puts the StartDate first while I clearly stated in $LogArrayDetails that Username goes first."
You've used a hashtable (@{ ... }) to declare the columns for the output CSV, but the order in which a hashtable's entries are enumerated is not guaranteed. In PSv3+, you can use an ordered hashtable instead ([ordered] @{ ... }) to achieve predictable enumeration; you get the same benefit if you convert the hashtable to a custom object by casting a hashtable literal to [pscustomobject], as shown above.
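A quick illustration (with sample values):

# Regular hashtable: enumeration order is NOT guaranteed to match definition order.
@{ Username = 'jdoe'; StartDate = '01/Jan/2024 10:00:00' }
# Ordered hashtable: entries enumerate in definition order (PSv3+).
[ordered] @{ Username = 'jdoe'; StartDate = '01/Jan/2024 10:00:00' }
# Casting a hashtable *literal* to [pscustomobject] also preserves definition order (PSv3+).
[pscustomobject] @{ Username = 'jdoe'; StartDate = '01/Jan/2024 10:00:00' }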
If you do want to append to the existing file, you can use the following, but note that:

- This approach does not scale well, because the entire log file is read into memory every time (and converted to objects), though limiting the entries to a month's worth should be fine.
- As stated, the approach is brittle, as things can go wrong while rewriting the file; consider simply writing 2 rows per execution instead, which allows you to append to the file line by line (see the sketch at the bottom).
- There is no concurrency management, so the assumption is that only ever one instance of the script runs at a time.
$FilePath = './t.csv'
$TimeStamp = (Get-Date).ToString("dd/MMM/yyyy HH:mm:ss")
# On Unix-like platforms, derive $env:USERNAME from $env:USER
# (Windows defines $env:USERNAME natively).
if (-not $env:USERNAME) { $env:USERNAME = $env:USER }

# Log start of execution. Note the empty 'FinishDate' property
# to ensure all rows ultimately have the same column structure.
[pscustomobject] @{ Username = $env:USERNAME; StartDate = $TimeStamp; FinishDate = '' } |
Export-Csv -NoTypeInformation -Append $FilePath

# Perform script actions...

# Log end of execution:
# Read the entire existing file; @(...) ensures an array even if
# the file contains just one data row.
$logRows = @(Import-Csv $FilePath)
# ... update the last row's .FinishDate property
$logRows[-1].FinishDate = (Get-Date).ToString("dd/MMM/yyyy HH:mm:ss")
# ... and rewrite the entire file, keeping only the (up to) 30 most recent
# entries (out-of-range indices in an array slice are silently ignored).
$logRows[-30..-1] | Export-Csv -NoTypeInformation $FilePath
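Finally, a minimal sketch of the 2-rows-per-execution alternative mentioned above; the Event column is a hypothetical addition used to distinguish start rows from finish rows:

# Append a 'Start' row...
[pscustomobject] @{ Username = $env:USERNAME; Event = 'Start'; Timestamp = (Get-Date).ToString("dd/MMM/yyyy HH:mm:ss") } |
Export-Csv -NoTypeInformation -Append $FilePath

# Perform script actions...

# ... then append a 'Finish' row; no rewriting of the file is needed.
[pscustomobject] @{ Username = $env:USERNAME; Event = 'Finish'; Timestamp = (Get-Date).ToString("dd/MMM/yyyy HH:mm:ss") } |
Export-Csv -NoTypeInformation -Append $FilePath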