I am migrating a MySQL database to MSSQL (SQL Server). I have a MySQL column with datatype varchar(8192).

MSSQL varchar(8000) seems like the best choice in terms of performance, but there could be data that is between 8000 and 8192 characters long. I'm curious what the most performance-efficient MSSQL datatype would be for this.
Is my only option to check whether I actually have data between 8000 and 8192 characters, and how often that data is looked up? If it's just a few rows, should I use varchar(max)? And if there are many such rows, what do I do?
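To answer the "how many rows" part, a quick check can be run on the MySQL side before migrating. This is just a sketch; the table name `t` and column name `c` are placeholders for your actual schema:

```sql
-- Count rows that would not fit in MSSQL varchar(8000)
SELECT COUNT(*) AS too_long
FROM t
WHERE CHAR_LENGTH(c) > 8000;
```

If this returns 0, varchar(8000) is safe outright; otherwise the count (and how often those rows are queried) informs the varchar(max) decision.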
AFAIK, if I use MSSQL varchar(max), values under 8000 bytes will be stored in-row like a regular varchar, and anything larger will be pushed off-row as LOB data?
Will MSSQL use different search strategies, e.g. first scan all rows whose data is stored in-row (< 8000) and then search the LOB storage for the rows where the data is > 8000? If that's the case, what if I have 1000 rows under 8000 and 1 row over 8000 that happens to be first in the table? Wouldn't that be a greater performance hit, since it would have to go through 1001 rows?
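For what it's worth, after migrating you can inspect where SQL Server actually put the data. A sketch, assuming a table `dbo.t` (a placeholder name):

```sql
-- Show page counts per allocation unit type:
-- IN_ROW_DATA vs ROW_OVERFLOW_DATA vs LOB_DATA
SELECT alloc_unit_type_desc, page_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('dbo.t'), NULL, NULL, 'DETAILED');
```

This would show whether any rows were actually pushed off-row in practice.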
Indexing should help, but is it advised here?
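Note that a varchar(max) column cannot be a key column in a regular index (index keys are limited to 900 bytes for a clustered index and 1700 bytes for a nonclustered index). One common workaround, if searches only need the start of the value, is to index a persisted computed prefix column. A sketch, with hypothetical names `t`, `c`, and prefix length 450:

```sql
-- varchar(max) itself cannot be an index key, so index a short prefix instead
ALTER TABLE t
    ADD c_prefix AS CAST(LEFT(c, 450) AS varchar(450)) PERSISTED;

CREATE INDEX IX_t_c_prefix ON t (c_prefix);
```

Whether this is worthwhile depends on your query patterns; for full-value searches on long strings, full-text indexing may be the better fit.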