I have a table with many nvarchar columns whose defined lengths are far larger than the actual maximum length of the data stored in them. I know that because nvarchar is a variable-length type, storage isn't a problem. But I am curious whether there is a query-performance hit on a table with 32 million rows. The columns in question are not part of any JOIN, WHERE, GROUP BY, or ORDER BY clauses.
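To illustrate (table and column names here are hypothetical, not my real schema), the situation looks like this, along with the kind of check I ran to compare actual data length against the declared length:

```sql
-- Hypothetical example: declared length far exceeds the data actually stored
CREATE TABLE dbo.Customers (
    CustomerId int IDENTITY PRIMARY KEY,
    Notes      nvarchar(4000) NULL  -- real data never exceeds ~100 characters
);

-- Compare the actual maximum data length with the declared length
SELECT MAX(LEN(Notes)) AS ActualMaxLen,
       4000            AS DeclaredLen
FROM dbo.Customers;
```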
Additionally, would the same oversized column definitions have any performance impact in an SSAS Tabular model?