
We decided to use NVARCHAR to store some information in some tables, because we assumed we would have a lot of special characters: the database contains French and German data.

After extracting some of the data, we estimated the full extract to be extremely large (around 20 TB).

Now we would like to check every table to see whether it actually contains special characters; where it doesn't, we would change the type from NVARCHAR to VARCHAR.

Any idea how we can query this?

Or what do you think is the best way to reduce the size?

Maro
  • http://stackoverflow.com/questions/2558755/how-to-detect-if-a-string-contains-special-characters – Pamma Mar 15 '13 at 22:17
  • I think you should write a procedure to query every column of every table for those special characters. A quick Google search turned up a pretty good one that you could modify a bit to fit your needs: http://vyaskn.tripod.com/search_all_columns_in_all_tables.htm It creates a temporary table that stores the table and column names where matches are found. – Scotch Mar 15 '13 at 22:18

3 Answers


Had to solve this a little while ago myself:

Use a pattern match (SQL Server's LIKE wildcard syntax, not a true regex):

LIKE '%[^a-zA-Z0-9]%'

To solve the problem of searching all the tables, see: How do I find a value anywhere in a SQL Server database? If that doesn't cover it, I have a better one somewhere that I use.
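
Putting the two together, here is a minimal sketch that generates one check statement per NVARCHAR column, using the pattern above (the allowed set and the [^a-zA-Z0-9 ] class are assumptions you may want to tighten; range matching also follows each column's collation, so add a binary COLLATE clause if you need strict code-point matching):

-- Generates one COUNT query per NVARCHAR column so you can see which
-- columns actually contain characters outside the allowed set
SELECT 'SELECT ''' + TABLE_SCHEMA + '.' + TABLE_NAME + '.' + COLUMN_NAME + ''' AS col_name, '
     + 'COUNT(*) AS special_rows '
     + 'FROM ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME) + ' '
     + 'WHERE ' + QUOTENAME(COLUMN_NAME) + ' LIKE ''%[^a-zA-Z0-9 ]%'';' AS check_stmt
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE = 'nvarchar';

Run the generated statements (or EXEC them via sp_executesql); any column that reports zero rows is a candidate for VARCHAR.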

RandomUs1r

Actually a much better idea is to use a compressed table. SQL Server 2008 has page compression and SQL Server 2008 R2 adds Unicode compression on top of that.
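
For illustration, enabling page compression on an existing table looks like this (dbo.Translations is an assumed example name), and sp_estimate_data_compression_savings lets you check the payoff before rebuilding 20 TB worth of data:

-- Estimate the savings first (dbo.Translations is an assumed name)
EXEC sp_estimate_data_compression_savings
    @schema_name = 'dbo',
    @object_name = 'Translations',
    @index_id = NULL,
    @partition_number = NULL,
    @data_compression = 'PAGE';

-- Then rebuild the table with page compression (SQL Server 2008+;
-- on 2008 R2 this also applies Unicode compression to NVARCHAR columns)
ALTER TABLE dbo.Translations REBUILD WITH (DATA_COMPRESSION = PAGE);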

Remus Rusanu

-- Returns rows whose Name contains anything other than letters, digits, or spaces
SELECT Name FROM your_table WHERE Name LIKE '%[^0-9a-zA-Z ]%'
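
Once a column comes back clean, the type change itself is a plain ALTER; the names and length here are assumed examples, and the nullability should match the column's existing definition:

-- Assumed example names; run only after confirming the column holds no
-- characters that would be lost in the NVARCHAR-to-VARCHAR conversion
ALTER TABLE dbo.Products ALTER COLUMN Name VARCHAR(100) NOT NULL;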

mohd Bilal