
I'm a bit confused about calculating the maximum possible size of a record in SQL Server. I'm getting this error when I add an int column to a table in the application I'm working on (a table I didn't create and that is new to me):

Warning: The table "myTable" has been created, but its maximum row size exceeds the allowed maximum of 8060 bytes. INSERT or UPDATE to this table will fail if the resulting row exceeds the size limit.
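For context, here's a minimal sketch of the kind of definition that triggers this (a hypothetical table, not my actual schema): the declared column sizes alone sum past 8060 bytes, so the warning fires even on an empty table.

-- Hypothetical repro: the declared varchar sizes already sum past 8060 bytes
CREATE TABLE dbo.WideTable (
    Id   int NOT NULL,
    ColA varchar(4000) NULL,
    ColB varchar(4000) NULL,
    ColC varchar(4000) NULL
);

-- Adding even a small fixed-length column repeats the same warning
ALTER TABLE dbo.WideTable ADD ExtraCol int NULL;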

I've tried the methods suggested here to figure out the maximum record size in my table, but they get very different results:

1)

SELECT sum(max_record_size_in_bytes) as maxSizeSum 
FROM sys.dm_db_index_physical_stats (DB_ID(N'MyDb'), OBJECT_ID(N'[dbo].[myTable]'), NULL, NULL , 'DETAILED')

This returns 5712 as maxSizeSum, so if this is right then the error I get on adding a field would have to be wrong...
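(In case it matters: with 'DETAILED', that DMV returns one row per index level, so summing max_record_size_in_bytes mixes leaf/data pages with intermediate index pages. Listing the levels separately, which I assume is closer to what the DMV intends, looks like this:)

SELECT index_id, index_level, max_record_size_in_bytes
FROM sys.dm_db_index_physical_stats (DB_ID(N'MyDb'), OBJECT_ID(N'[dbo].[myTable]'), NULL, NULL , 'DETAILED')
ORDER BY index_id, index_level;
-- Either way, this measures the largest record that currently exists,
-- not the largest record the table definition would allow.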

2)

dbcc showcontig ('myTable') with tableresults

This gets 3976, which isn't surprising given the comment in the post that it only measures current data, not possible data.

3)

SELECT OBJECT_NAME(id) AS tablename,
       COUNT(1)        AS nr_columns,
       SUM(length)     AS maxrowlength
FROM   syscolumns
GROUP BY OBJECT_NAME(id)
ORDER BY OBJECT_NAME(id)

The line returned for myTable has a maxrowlength of 52498. I guess that's possible, but it's way above both the allowed limit and the results of the other methods.

4) This post has another method, which also gets 52498, so I'm leaning towards thinking that's the 'real' answer, but am hoping for confirmation given the differences of opinion:

SELECT schema_name(T.schema_id) AS SchemaName,
       T.name AS TableName,
       SUM(C.max_length) AS RowSize
FROM   sys.tables T
       INNER JOIN sys.columns C
           ON T.object_id = C.object_id
       INNER JOIN sys.types S
           ON C.system_type_id = S.system_type_id
WHERE  T.name = 'myTable'
GROUP BY schema_name(T.schema_id),
         T.name
ORDER BY schema_name(T.schema_id),
         T.name
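(Side note: sys.types can hold several rows with the same system_type_id once user-defined types are involved, so that join can duplicate column rows. A variant joining on user_type_id, which is unique in sys.types, avoids that:)

SELECT schema_name(T.schema_id) AS SchemaName,
       T.name AS TableName,
       SUM(C.max_length) AS RowSize  -- note: max_length is -1 for varchar(max) etc.
FROM   sys.tables T
       INNER JOIN sys.columns C
           ON T.object_id = C.object_id
       INNER JOIN sys.types S
           ON C.user_type_id = S.user_type_id  -- unique key, so no duplicate rows
WHERE  T.name = 'myTable'
GROUP BY schema_name(T.schema_id), T.name;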

I don't have any nvarchar fields in my database, so that isn't a concern for me. Are the last 2 methods right and the others wrong?

Many thanks!!

user756366
  • but you aren't receiving an error, just a warning – Lamak Jan 03 '17 at 18:04
  • Why don't you post the table definition somewhere? There is more to row size than *just* the potential length of each column. Each row has overhead for NULL bitmaps, lengths of variable data types, etc. Those don't explain 52498 but without seeing the table definition we can only guess. – Aaron Bertrand Jan 03 '17 at 19:47
  • Thanks! Unfortunately the specific table is confidential, so I can't post it. I was hoping that there would be generic explanations that would apply to all tables - sounds from what you're saying like NULL bitmaps and the variable data types impact that, but I would hope there would be a limited number of such things for the data types and values that SqlServer supports. Are you saying there is no formula/query that could generically (i.e. applies to all table definitions) calculate the max row length of a row for any sql server table based on queries that access that table's metadata? – user756366 Jan 19 '17 at 20:32

1 Answer


The first comment is the most telling: you are receiving a warning, not an error.

You are not exceeding the maximum record size now. Your table could exceed the maximum record size, given the declared sizes of its columns.

As you have demonstrated by looking at the actual data size, your operation appears to be safe -- for the time being.
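If you want to estimate the declared maximum yourself, here is a rough sketch (my own approximation, not the engine's exact arithmetic). The on-disk row format adds overhead beyond the raw column bytes: a row header, column-count fields, a NULL bitmap, and a two-byte offset per variable-length column. The sketch also assumes no varchar(max)-style columns, whose max_length is reported as -1:

-- Approximate declared maximum in-row size for one table (sketch only)
SELECT  t.name AS TableName,
        SUM(c.max_length) AS DeclaredColumnBytes,
        -- rough per-row overhead: 4-byte header, two 2-byte column counts,
        -- NULL bitmap (1 bit per column), 2 bytes per variable-length column
        4 + 2 + 2
          + (COUNT(*) + 7) / 8
          + 2 * SUM(CASE WHEN ty.name IN ('varchar', 'nvarchar', 'varbinary')
                         THEN 1 ELSE 0 END) AS ApproxOverheadBytes
FROM    sys.tables t
        INNER JOIN sys.columns c ON c.object_id = t.object_id
        INNER JOIN sys.types ty ON ty.user_type_id = c.user_type_id
WHERE   t.name = 'myTable'
GROUP BY t.name;

Comparing DeclaredColumnBytes plus ApproxOverheadBytes against 8060 gives a rough sense of how far over the limit the definition is.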

Gordon Linoff
  • Thanks - yes, I do understand that I don't currently have data in my table on my local machine that exceeds the maximum row size, but the app is hosted for clients, so any of them could exceed that maximum at any time. That's why I want to understand how to determine whether the table definition allows a row to exceed the maximum row size, and by how much. – user756366 Jan 19 '17 at 20:34