
I have this PowerShell script:

$CSV = Import-Csv "records.csv"
foreach ($Row in $CSV) {
    $Q = "INSERT INTO database..table([id], [FirstName], [LastName]) VALUES ('" + $Row.'id' + "','" + $Row.'First Name' + "','" + $Row.'Last Name' + "')"
    Invoke-QsSqlQuery -query $Q -SQLServer <servername> -database <databaseName>
}

Note: Invoke-QsSqlQuery is my own function.

My problem is that I am invoking a SQL command for every row. This has created a performance issue.

I want to build $Q so that it holds 1000 rows and then invoke it against the SQL server. A bulk load is not possible because, with BULK INSERT, the file needs to be copied locally onto the server, and that is not allowed.

With a do/while loop I can count to 1000; that's not too hard. But what if my remaining records number fewer than 1000?
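
For reference, a rough sketch (untested) of the counting approach I have in mind; flushing the final partial batch is the part I'm unsure about:

$Buffer = New-Object System.Collections.Generic.List[string]
foreach ($Row in $CSV) {
    $Buffer.Add("INSERT INTO database..table([id], [FirstName], [LastName]) VALUES ('$($Row.id)','$($Row.'First Name')','$($Row.'Last Name')')")
    if ($Buffer.Count -eq 1000) {
        Invoke-QsSqlQuery -query ($Buffer -join [System.Environment]::NewLine) -SQLServer <servername> -database <databaseName>
        $Buffer.Clear()
    }
}
# The leftover batch holds fewer than 1000 rows - this is the flush I'm asking about
if ($Buffer.Count -gt 0) {
    Invoke-QsSqlQuery -query ($Buffer -join [System.Environment]::NewLine) -SQLServer <servername> -database <databaseName>
}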

How can I build a query that will insert multiple records at once?

H. Schutte
  • This can be done, from what I have seen [here](http://stackoverflow.com/questions/2624713/how-do-i-insert-multiple-rows-without-repeating-the-insert-into-dbo-blah-part). What version of SQL Server are you using? You might have a record limit that needs to be accounted for – Matt Jul 24 '15 at 15:29

2 Answers


This should be fairly simple:

  1. Assign all the statements to a string array (your foreach loop is totally appropriate here)
  2. Split the string array into multiple string arrays of 1000 elements or fewer
  3. Concatenate each group
  4. Execute each concatenated query

$CSV = Import-Csv "records.csv"

$SQLServer   = "dbserver.corp.company.tld"
$SQLDatabase = "database"

# Set up a string format template
$InsertTemplate = "INSERT INTO database..table([id], [FirstName], [LastName]) VALUES ('{0}','{1}','{2}')"

# Generate all insert statements and store in string array
$AllInserts = foreach($Row in $CSV){
    $InsertTemplate -f $Row.id,$Row.'First Name',$Row.'Last Name'
}

# Split the array into an array of string arrays, each with 1000 (or fewer) statements
# (.Count rather than .Length, so a one-row CSV doesn't report string length instead)
$RowArrays = for($i=0; $i -lt $AllInserts.Count; $i+=1000){
    ,@($AllInserts[$i..($i+999)])
}

# For each array of 1000 (or fewer) insert statements, concatenate them with newlines and invoke the batch
foreach($RowArray in $RowArrays){
    $Query = $RowArray -join [System.Environment]::NewLine
    Invoke-QsSqlQuery -query $Query -SQLServer $SQLServer -database $SQLDatabase
}
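
Note that PowerShell silently skips indexes past the end of an array when slicing, so the final batch simply contains however many rows remain; no special handling is needed when fewer than 1000 records are left. A quick demonstration:

@('a','b','c')[0..9]    # out-of-range indexes are dropped; returns just 'a','b','c'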
Mathias R. Jessen
  • I _was_ working on building a statement with one INSERT statement. Since I don't know, I am curious: would the single insert be faster than the multiples you have? Or is it six of one and half a dozen of the other? – Matt Jul 24 '15 at 15:41
  • @Matt Thought about that as well, but then I glanced [over this](http://stackoverflow.com/questions/8635818/multiple-insert-statements-vs-single-insert-with-multiple-values) - seems that it might cause a huge increase in compile time – Mathias R. Jessen Jul 24 '15 at 15:51
  • Still, it should be trivial to do the other way around: just remove the "INSERT INTO" part from the template and prepend it to the final query string before execution (a sketch of that variant follows these comments) – Mathias R. Jessen Jul 24 '15 at 15:52
  • I am getting an error: Cannot convert value "INSERT INTO " to type "System.Int32". Error: "Input string was not in a correct format." At line:2 char:16 + ,@($AllInserts[$i]..$AllInserts[$i+999]) + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + CategoryInfo : InvalidArgument: (:) [], RuntimeException + FullyQualifiedErrorId : InvalidCastFromStringToInteger – H. Schutte Jul 24 '15 at 17:22
  • Change $AllInserts.Length to $AllInserts.Count – Mathias R. Jessen Jul 24 '15 at 17:25
  • And enclose $AllInserts[] in parentheses – Mathias R. Jessen Jul 24 '15 at 17:26
  • Sorry, I don't know what you mean by "enclose $AllInserts[] in parentheses" – H. Schutte Jul 24 '15 at 17:39
  • $AllInserts[$i+999] becomes ($AllInserts[$i+999]) – Mathias R. Jessen Jul 24 '15 at 17:41
  • Still getting an error. This is my code now: '# Split array into an array of 1000 (or fewer) string arrays $RowArrays = for($i=0; $i -lt $AllInserts.Count; $i+=1000){ ,@($AllInserts[$i]..($AllInserts[$i+999])) # write-host $AllInserts }' – H. Schutte Jul 24 '15 at 17:46
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/84209/discussion-between-h-schutte-and-mathias-r-jessen). – H. Schutte Jul 24 '15 at 17:49
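
For completeness, here is a minimal sketch (untested) of the single-statement variant discussed in the comments above: the template keeps only the row tuple, and the INSERT header is prepended once per batch. The 1000-row batch size also matters here, since SQL Server allows at most 1000 row value expressions in one INSERT ... VALUES statement.

# Template now holds only the row tuple
$ValuesTemplate = "('{0}','{1}','{2}')"

# Build all row tuples (wrapped in @() so a single-row CSV still yields an array)
$AllValues = @(foreach($Row in $CSV){
    $ValuesTemplate -f $Row.id,$Row.'First Name',$Row.'Last Name'
})

for($i=0; $i -lt $AllValues.Count; $i+=1000){
    # At most 1000 tuples per statement; out-of-range indexes are skipped on the last pass
    $Values = $AllValues[$i..($i+999)] -join (',' + [System.Environment]::NewLine)
    $Query  = 'INSERT INTO database..table([id], [FirstName], [LastName]) VALUES' +
              [System.Environment]::NewLine + $Values
    Invoke-QsSqlQuery -query $Query -SQLServer $SQLServer -database $SQLDatabase
}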
1

Depending on some factors, you may be able to use BULK INSERT instead of multiple INSERTs.

There are two requirements to satisfy:
- the file to import must be located on the SQL Server itself
- the CSV file must comply with the specific format requirements stated on MSDN for files to be imported (typical options are sketched below)

Should you be able to comply with the above requirements, you may be able to import the whole file with a statement as simple as this one:

BULK INSERT database..table
FROM 'C:\FileToImport.csv';
GO
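
The format requirements mostly come down to delimiters. Below is a hedged sketch of the same statement with its WITH options spelled out; FIRSTROW, FIELDTERMINATOR and ROWTERMINATOR are standard BULK INSERT options, but the comma delimiter and header row are assumptions about the file. It reuses the asker's own Invoke-QsSqlQuery and the $SQLServer/$SQLDatabase variables from the other answer:

# A sketch (untested) - adjust the terminators to match the actual file
$BulkQuery = @"
BULK INSERT database..table
FROM 'C:\FileToImport.csv'
WITH (
    FIRSTROW = 2,          -- skip the header row
    FIELDTERMINATOR = ',', -- comma-delimited fields
    ROWTERMINATOR = '\n'   -- one record per line
);
"@
Invoke-QsSqlQuery -query $BulkQuery -SQLServer $SQLServer -database $SQLDatabase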
Paolo
  • This is the quickest way, but I cannot copy the file to the SQL server, like I said. But thanks anyway for thinking along with me about a solution – H. Schutte Jul 24 '15 at 17:57