
I have a table with many nvarchar columns whose declared length is much larger than the actual maximum length of the data stored in them. I know that because nvarchar is variable-length, the declared length doesn't affect storage. But I am curious whether it causes a hit on query performance in a table with 32 million rows. The columns in question are not part of any JOINs, WHEREs, GROUP BYs, ORDER BYs, etc.
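For reference, this is roughly how I compared the declared lengths with the longest values actually stored (the table and column names dbo.BigTable, Col1, and Col2 below are placeholders for the real ones):

    -- Declared length of each nvarchar column in the table.
    -- sys.columns.max_length is in bytes; nvarchar uses 2 bytes per character,
    -- and -1 means nvarchar(max).
    SELECT c.name           AS column_name,
           c.max_length / 2 AS declared_length
    FROM sys.columns AS c
    JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
    WHERE c.object_id = OBJECT_ID(N'dbo.BigTable')
      AND t.name = N'nvarchar';

    -- Longest value actually stored in each column
    -- (this is a full scan over the 32 million rows).
    SELECT MAX(LEN(Col1)) AS actual_max_col1,
           MAX(LEN(Col2)) AS actual_max_col2
    FROM dbo.BigTable;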

Additionally, would the same situation cause any performance hit in an SSAS Tabular Model?

asked by Bill T · edited by Tab Alleman
  • What do you mean by "a hit on query performance"? I am not sure what you are asking here. Are you joining on this column or just selecting it? Give us a little bit to work with here. – Sean Lange Feb 25 '16 at 19:41
  • http://stackoverflow.com/questions/2009694/is-there-an-advantage-to-varchar500-over-varchar8000 – Bruce Dunwiddie Feb 25 '16 at 20:04

0 Answers