
In SQL Server, temp tables perform much better (in terms of execution time) than table variables when working with large data sets (say, inserting or updating 100,000 rows) (reference: SQL Server Temp Table vs Table Variable Performance Testing).

I've seen many articles comparing temp tables and table variables, but I still don't understand what exactly makes temp tables more efficient when working with large data. Is it just how they are designed to behave, or something else?

user1080381
There are many answers already out there: https://stackoverflow.com/questions/11857789/when-should-i-use-a-table-variable-vs-temporary-table-in-sql-server – Mitch Wheat Feb 21 '18 at 10:12

1 Answer


Table variables don't have statistics, so the cardinality estimate for a table variable is 1 row.

You can at least force a correct row-count estimate by using OPTION (RECOMPILE), but there is no way to get column statistics, i.e. the per-column data distribution that does exist for temporary tables.
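A minimal T-SQL sketch of this point (table and column names are illustrative, not from the question):

```sql
-- Hypothetical demo table variable with 100,000 rows.
DECLARE @tv TABLE (id INT, val INT);

INSERT INTO @tv (id, val)
SELECT n, n % 100
FROM (SELECT TOP (100000)
             ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
      FROM sys.all_objects a CROSS JOIN sys.all_objects b) AS nums;

-- Without RECOMPILE the optimizer assumes @tv holds 1 row.
SELECT * FROM @tv WHERE val = 42;

-- OPTION (RECOMPILE) corrects the total row count (100,000),
-- but still cannot estimate how many rows match val = 42,
-- because @tv has no column statistics.
SELECT * FROM @tv WHERE val = 42 OPTION (RECOMPILE);
```

Comparing the estimated row counts in the two actual execution plans makes the difference visible: the first query is costed as if the table variable were nearly empty.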

The consequence is evident: every query that uses a table variable is likely to get an underestimate, which can lead to a poor execution plan (for example, nested loops joins chosen for what is really a large row set).

Another drawback is this:

Queries that insert into (or otherwise modify) @table_variables cannot have a parallel plan; #temp_tables are not restricted in this manner.
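To illustrate (a sketch with made-up names; whether the first insert actually goes parallel depends on the plan's cost and the server's configuration):

```sql
-- Same large INSERT ... SELECT into a temp table vs. a table variable.
CREATE TABLE #big (id INT, val INT);
DECLARE @big TABLE (id INT, val INT);

-- Eligible for a parallel plan (cost permitting):
INSERT INTO #big (id, val)
SELECT object_id, column_id FROM sys.columns;

-- Always forced to run serially:
INSERT INTO @big (id, val)
SELECT object_id, column_id FROM sys.columns;

DROP TABLE #big;
```

For a 100,000-row insert like the one in the question, losing parallelism on the table-variable side is one concrete reason the temp table finishes faster.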

You can read more on it here:

Parallelism with temp table but not table variable?

The answer in that topic links to additional reading that is very helpful.

sepupic