I am writing a script (a simple one, I thought) to replace some strings in CSV files. Those strings are so-called "keys" of objects: I basically replace the "old key" in the files with a "new key".
    function simpleStringReplacement {
        param (
            $sourceFiles,  # list of CSV files in which we need to replace contents
            $mappingList,  # a file that contains two columns: the old key and the new key
            $exportFolder, # folder where I expect the results
            $FieldsToSelectFromTargetFilesIntoMappingFile # the names of the fields holding the replacement values change, so I pass them in this array
        )
        $totalitems = $sourceFiles.count
        $currentrow = 0
        Write-Output "Importing mapper file $mappingList" | logText
        $findReplaceList = Import-Csv -Path $mappingList -Delimiter ';'
        foreach ($sourceFile in $sourceFiles) {
            $currentrow += 1
            Write-Output "Working on $currentrow : $sourceFile" | logText
            [string] $txtsourceFile = Get-Content $sourceFile.FullName | Out-String
            $IssueKey    = $FieldsToSelectFromTargetFilesIntoMappingFile[0]
            $OldIssueKey = $FieldsToSelectFromTargetFilesIntoMappingFile[1]
            foreach ($findReplaceItem in $findReplaceList) {
                $txtsourceFile = $txtsourceFile -replace $findReplaceitem.$OldIssueKey, $findReplaceitem.$IssueKey
            }
            $outputFileName = $sourceFile.Name.Substring(0, $sourceFile.Name.IndexOf('.csv')) + "_newIDs.csv"
            $outputFullFileName = Join-Path -Path $exportFolder -ChildPath $outputFileName
            Write-Output "Writing result to $currentrow : $outputFullFileName" | logText
            $txtsourceFile | Set-Content -Path $outputFullFileName
        }
    }
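For reference, `-replace` treats its first operand as a .NET regular expression, not a literal string, so any key containing regex metacharacters (`.`, `+`, `(`, `)`, etc.) is interpreted as a pattern. A minimal sketch of the difference, using made-up key values:

```powershell
# -replace interprets the search string as a regex,
# so '.' in a key matches any single character:
'text KEY-1.OLD text' -replace 'KEY-1.OLD', 'KEY-2'   # -> 'text KEY-2 text'

# [regex]::Escape forces a literal match of the key:
$oldKey  = 'KEY-1.OLD'                 # hypothetical key value
$escaped = [regex]::Escape($oldKey)    # -> 'KEY-1\.OLD'
'text KEY-1.OLD text' -replace $escaped, 'KEY-2'      # -> 'text KEY-2 text'
```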
The issue I have: already while the script is working on the first file (the first iteration of the outer loop), I get:
Insufficient memory to continue the execution of the program.
And the error points at my code line with the replacement:
$txtsourceFile = $txtsourceFile -replace $findReplaceitem.$OldIssueKey , $findReplaceitem.$IssueKey
The CSV files are "big", but really not that big:
The mappingList is 1.7 MB.
Each source file is around 1.5 MB.
I can't really understand how I run into memory issues with these file sizes, and of course I have no idea how to avoid the problem.
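One way I could imagine a 1.5 MB string blowing up during `-replace` (purely a hypothesis, worth ruling out): if any row of the mapping file has an empty old-key field, the empty pattern matches at every character position, so the new key is inserted between every pair of characters. A quick console demonstration:

```powershell
# An empty regex pattern matches at every position in the string:
'abc' -replace '', 'X'   # -> 'XaXbXcX'

# On a 1.5 MB source string, a single mapping row with an empty old key
# and a ~10-character new key would add roughly 15 MB, and every later
# replacement then operates on the already inflated string.
```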
I found some blog posts about memory issues in PowerShell. They all end up changing the PowerShell MaxMemoryPerShellMB quota defaults. That doesn't work at all for me, as I run into an error with
get-item WSMAN:\localhost\shell\MaxMemoryPerShellMB
saying: "get-item : Cannot find path 'WSMan:\localhost\Shell\MaxMemoryPerShellMB' because it does not exist."
I am working in VS Code.
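In case it matters: as far as I understand, the WSMan: drive is only available when the WinRM service is running (and the console is elevated), and MaxMemoryPerShellMB limits remote WinRM sessions, not a local script run from VS Code. What I would check, assuming an elevated console:

```powershell
Get-Service WinRM      # is the WinRM service running?
Start-Service WinRM    # start it if not (requires admin rights)
Get-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB
```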