I'm trying to find a way to pull previously exported data back in from a file, consolidate it with the newly gathered data, and then export the combined result.
My script works like this: look up users and computers from many sources and consolidate the data → build a two-column array (Name, Computer) → export that data to output.log.
Because the data I'm looking for changes dynamically from time to time, I want to run the script multiple times a day, so on the next run it should read the existing output.log into an array → continue gathering new data → ADD the new rows to the existing output.log.
At the moment I'm stuck: every time I run the script it overwrites output.log.
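I know Export-Csv has an -Append switch (PowerShell 3.0+), something like:

$inputCsv | Export-Csv "C:\test\OutputFinal.log" -NoTypeInformation -Append

but as far as I can tell that would just keep stacking duplicate rows on every run instead of merging them, so I don't think appending alone is the answer.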
Here is my code:
$Computers = @("pc1","pc2")
$LogNames  = @("something")

$el_c = @()
foreach ($comp in $Computers) {
    foreach ($log in $LogNames) {
        $el = ...      # get the data I need from $comp
        $el_c += $el   # consolidating
    }
}

# Shape the two-column (Name, Computer) objects and export them
$el_c |
    Select-Object @{n='Name';e={$_.Properties[0].Value}},
                  @{n='Computer';e={$_.Properties[1].Value}} |
    Export-Csv "C:\test\OutputRaw.log" -NoTypeInformation

# Filter out duplicate rows ($input is an automatic variable in
# PowerShell, so I renamed it to $inputFile)
$inputFile = 'C:\test\OutputRaw.log'
$inputCsv  = Import-Csv $inputFile | Sort-Object * -Unique
$inputCsv | Export-Csv "C:\test\OutputFinal.log" -NoTypeInformation
Output is:

"Name","Computer"
"Dan","PC1"
"Tom","PC2"
How can I make the script, before exporting, ALSO read the existing data from output.log and merge/add/consolidate it with the newly gathered data?
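This is roughly what I'm imagining, as a sketch: read the previous OutputFinal.log back in (guarding the first run, when the file doesn't exist yet), concatenate it with the freshly shaped rows, de-duplicate, and write everything back:

$finalLog = 'C:\test\OutputFinal.log'

# New data from this run, shaped into (Name, Computer) as above
$newRows = $el_c | Select-Object @{n='Name';e={$_.Properties[0].Value}},
                                 @{n='Computer';e={$_.Properties[1].Value}}

# Rows from the previous run, if the file exists yet
$oldRows = if (Test-Path $finalLog) { Import-Csv $finalLog } else { @() }

# Merge old + new, drop duplicate rows, write the combined set back
@($oldRows) + @($newRows) |
    Sort-Object Name, Computer -Unique |
    Export-Csv $finalLog -NoTypeInformation

Is that the right direction, or is there a cleaner way to merge before export?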