I have this PowerShell script:
$CSV = Import-Csv "records.csv"
foreach ($Row in $CSV) {
    $Q = "INSERT INTO database..table ([id], [FirstName], [LastName]) VALUES ('" + $Row.'id' + "', '" + $Row.'First Name' + "', '" + $Row.'Last Name' + "')"
    Invoke-QsSqlQuery -query $Q -SQLServer <servername> -database <databaseName>
}
Note: Invoke-QsSqlQuery is my own function.
My problem is that I am invoking a SQL command for every row. This has created a performance issue.
I want to build $Q so that it inserts 1000 rows at a time, and then run it against the SQL server. A bulk insert is not possible because, with bulk, the file has to be copied to a location local to the server, and that is not allowed.
With a do/while loop I can count to 1000 easily enough, but what happens when the remaining records number fewer than 1000?
How can I build a query that will insert multiple records at once?
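One way to handle the remainder is to accumulate the VALUES tuples in a list, flush every time the list reaches 1000 entries, and flush once more after the loop for whatever is left, so a final batch smaller than 1000 is handled by the same code path. A sketch, reusing the CSV columns, the placeholders, and the Invoke-QsSqlQuery helper from the question (SQL Server caps a single VALUES table value constructor at 1000 rows, so the batch size should not exceed that):

```powershell
$CSV       = Import-Csv "records.csv"
$BatchSize = 1000   # SQL Server's limit for one VALUES row constructor list
$Values    = New-Object System.Collections.Generic.List[string]

function Invoke-Batch($Tuples) {
    $Q = "INSERT INTO database..table ([id], [FirstName], [LastName]) VALUES " + ($Tuples -join ', ')
    Invoke-QsSqlQuery -query $Q -SQLServer <servername> -database <databaseName>
}

foreach ($Row in $CSV) {
    # Double up single quotes so embedded apostrophes don't break the statement
    $id = $Row.'id'         -replace "'", "''"
    $fn = $Row.'First Name' -replace "'", "''"
    $ln = $Row.'Last Name'  -replace "'", "''"
    $Values.Add("('$id', '$fn', '$ln')")

    if ($Values.Count -eq $BatchSize) {
        Invoke-Batch $Values   # full batch of 1000 rows
        $Values.Clear()
    }
}

# Flush the leftover rows (fewer than 1000) after the loop ends
if ($Values.Count -gt 0) {
    Invoke-Batch $Values
}
```

This issues one round trip per 1000 rows instead of one per row, and the trailing `if` makes the "fewer than 1000 remaining" case fall out naturally; escaping the quotes is a stopgap, and a parameterized or table-valued-parameter approach would be safer if Invoke-QsSqlQuery supports it.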