Consider a table like this:
CREATE TABLE Product (
Id int PRIMARY KEY CLUSTERED,
InvoicesStr varchar(max)
)
where InvoicesStr
holds a comma-separated list of the Ids of the invoices that contain this Product.
I know this is not a proper design; I'm only using it to demonstrate the problem I want to describe.
So the table data would be something like this:
Product
Id | InvoicesStr
----|-------------------------------------
1 | 4,5,6,7,34,6,78,967,3,534,
2 | 454,767,344,567,89676,4435,3,434,
After selling millions of products, InvoicesStr
would contain a very long string.
Consider a row where this column holds a very large value, say a 1 GB string.
I want to know about the performance for such an update query:
UPDATE Product
SET InvoicesStr = InvoicesStr + '584,'
WHERE Id = 100
Does the performance of this query depend on the size of InvoicesStr? Or is SQL Server smart enough to just append the new string rather than rewriting the whole value?
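For comparison, T-SQL also has a documented .WRITE clause for updating varchar(max)/nvarchar(max) columns in place; passing NULL as the offset appends the expression to the end of the value. A minimal sketch of what an append-style update could look like (same hypothetical table as above):

UPDATE Product
SET InvoicesStr.WRITE('584,', NULL, 0) -- NULL offset = append to end; length is ignored
WHERE Id = 100

I'm not sure whether the plain string-concatenation UPDATE gets the same treatment internally, which is the core of my question.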