I have a table with 20 million rows that includes 4 columns with data types:

  • nvarchar(20)
  • nvarchar(255)
  • nvarchar(255)
  • nvarchar(max)

Since I'm not storing Unicode characters, I should be able to convert to varchar and use roughly half the space, right?
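To sanity-check that expectation, here's a quick comparison (illustrative literal only; my real data is obviously longer) showing that nvarchar stores 2 bytes per character while varchar stores 1 for single-byte characters:

```sql
-- nvarchar (UTF-16) uses 2 bytes per character; varchar uses 1
-- for characters in a single-byte code page.
SELECT DATALENGTH(N'hello') AS nvarchar_bytes,  -- 10
       DATALENGTH('hello')  AS varchar_bytes;   -- 5
```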

ALTER TABLE myTable ALTER COLUMN myColumn1 varchar(20) not null; 
ALTER TABLE myTable ALTER COLUMN myColumn2 varchar(255) not null; 
ALTER TABLE myTable ALTER COLUMN myColumn3 varchar(255) not null; 
ALTER TABLE myTable ALTER COLUMN myColumn4 varchar(max) not null; 
UPDATE STATISTICS myTable WITH FULLSCAN, NORECOMPUTE;
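One thing I haven't tried yet (a sketch, assuming SQL Server 2008+ and the same myTable as above): from what I've read, ALTER COLUMN can leave the space from the old column versions allocated in the pages until the table or its indexes are rebuilt, so a rebuild might be needed before the size comparison is meaningful:

```sql
-- Rebuild the table (heap or clustered index) to compact pages
-- and release space left over from the old nvarchar columns.
ALTER TABLE myTable REBUILD;

-- Then re-check the allocated vs. used size.
EXEC sp_spaceused 'myTable';
```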

Well, after altering the columns and updating the statistics, the table actually grew from 9 GB to 13 GB. Any thoughts on how this could be?

Here's how I'm determining table size:

https://stackoverflow.com/a/7892349/7274782

Comments:
  • How are you doing the conversion? Adding information about the process you used will greatly help in answering this question. – Kateract Dec 09 '16 at 23:34
  • ALTER TABLE myTable ALTER COLUMN myColumn varchar(20) not null; UPDATE STATISTICS Production.Product(Products) WITH FULLSCAN, NORECOMPUTE; – dmgutenkauf Dec 09 '16 at 23:51
  • Please edit your question and include this information. – Kateract Dec 09 '16 at 23:59
