
Should I be concerned about IN expressions with hundreds of values in the array? For example:

myCriteria.Add(Restrictions.In("Id", myArrayOfHundredsOfItems));

which results in the SQL:

select * from MyTable where Id in (1,2,3,4, ...etc .. 900)

MSDN says "Including an extremely large number of values (many thousands) in an IN clause can consume resources and return errors 8623 or 8632."

OK, so 'many thousands' is out of the question. Is there any reason to avoid an array of maybe a few hundred items?

cbp
  • Can you clarify your scenario? I think this is one of those, "It depends" questions. There are likely better ways to do what you're trying to do, but there's no way to give a constructive answer without understanding the scenario. – csano May 17 '11 at 06:52

1 Answer


There is another limit: the values in the IN clause are passed as individual parameters, and there is a limit on the number of parameters per query. It is probably the same as for stored procedures, which is 2100.

I have a similar situation where I pass a large number of Guids. I split it up into several queries, each of them getting up to 1000 parameters. The queries are executed as a multi-criteria batch, so there is only a single database roundtrip.
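A rough sketch of that chunking approach, assuming an entity called MyEntity and an id list called allIds (both placeholder names):

```csharp
// Split the id list into chunks of at most 1000 parameters each,
// then run all chunks as one multi-criteria batch (single roundtrip).
const int chunkSize = 1000;

IMultiCriteria multi = session.CreateMultiCriteria();
for (int i = 0; i < allIds.Count; i += chunkSize)
{
    Guid[] chunk = allIds.Skip(i).Take(chunkSize).ToArray();
    multi.Add(DetachedCriteria.For<MyEntity>()
        .Add(Restrictions.In("Id", chunk)));
}

// Each chunk's results come back as a separate IList inside the outer list.
var results = new List<MyEntity>();
foreach (IList list in (IList)multi.List())
{
    results.AddRange(list.Cast<MyEntity>());
}
```

The 1000 here is just a safety margin below the 2100 parameter limit, not a magic number.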

Performance may suffer when using IN compared to joins. In my scenario I don't have any performance issues because of that; if you don't have that many parameters, it most probably won't be an issue for you either.

By the way, getting objects by id could also be implemented by:

  • Get<Entity>(id), which executes a single query, but only when the entity isn't in the cache yet.
  • Load<Entity>(id), which creates a proxy if the entity isn't in the cache and only loads it when a property is accessed. It most probably makes use of batch-size and loads several instances at once (pre-fetching).

Unfortunately, there isn't a future-get.
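To illustrate the difference (MyEntity and id are placeholders):

```csharp
// Get hits the database immediately unless the entity is already cached;
// Load returns an uninitialized proxy and defers the query.
MyEntity fetched = session.Get<MyEntity>(id);  // may query right away
MyEntity proxy   = session.Load<MyEntity>(id); // no query yet
string name = proxy.Name;                      // query happens here (if not cached)
```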

Stefan Steinegger