
I've got quite a few SQL statements like this:

SELECT foo FROM things WHERE user_id IN (1,2,3..n)

Is there a known limit to the number of elements that will safely fit in an IN clause like that?

Kyle

6 Answers


The 1000 limit in PostgreSQL is not a hard limit; it is an optimization limit, i.e. beyond 1000 entries PostgreSQL doesn't handle it very well. Of course, I have to ask what in the world you are doing with a 1000-entry IN clause.

Joshua D. Drake

For MySQL, from the manual:

The number of values in the IN list is only limited by the max_allowed_packet value.
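
For reference (this part is not from the manual quote above): you can check the current value and, with sufficient privileges, raise it. The value is in bytes, and the 64 MB figure below is only an example:

-- Check the current packet limit (bytes)
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for new connections (requires the SUPER privilege; the size is an example)
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;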

D'Arcy Rittich

There is no technical limit, but there is some kind of 'good sense' limit.

Having too many elements in the IN clause probably means the query has a bad design (IMHO).

Strae
  • You can almost always replace IN with some JOIN statement, unless the list comes from an external (non-SQL) source. – SF. Mar 11 '10 at 15:47
  • And if the list is external, you can always insert it into a temporary table and then do a join, which, perhaps after creating some indexes, might turn out to be a good plan. – araqnid Mar 12 '10 at 16:25
  • @araqnid: can views be used with an external source? – Strae Mar 12 '10 at 18:26
  • Not directly, without some magic, but you could create a temp table, do a COPY to populate it from your app, index/analyze the temp table and do a join (a minimal sketch of this follows these comments). Long-winded, and only useful if there are a lot of values that have to be sent over. Of course, if you do need some sort of external source and you can write a function to obtain those values, that will work too: either use the function directly in the query or spool it to a temp table. – araqnid Mar 13 '10 at 12:05
  • @Strae in PostgreSQL 9.1 it is possible to use external data as a table. See http://wiki.postgresql.org/wiki/Foreign_data_wrappers. – Roman-Stop RU aggression in UA Nov 19 '13 at 12:31
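
A minimal sketch of the temp-table approach from the comments above, assuming PostgreSQL and psql; the wanted_ids table and its column are made-up names, while things/foo/user_id come from the question:

-- Hypothetical temp table to hold the externally supplied ids
CREATE TEMP TABLE wanted_ids (user_id integer);

-- Populate it from the app (in psql, end the COPY input with a line containing \.)
COPY wanted_ids (user_id) FROM STDIN;
1
2
3
\.

-- Index and analyze so the planner has statistics to work with
CREATE INDEX ON wanted_ids (user_id);
ANALYZE wanted_ids;

-- Join instead of building a huge IN list
SELECT t.foo
FROM things t
JOIN wanted_ids w ON w.user_id = t.user_id;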

I've used it in dynamic queries created for Postgres with SQLAlchemy, with over 25k parameters. They were also passed to Python functions via positional parameters (*args)... but I didn't notice a slowdown in my case. YMMV

Marco Mariani

No, but be careful when using the IN statement. If you use a subquery in your IN statement, performance might suffer, because SQL Server must generate the entire result set and essentially build a potentially large IF statement internally.

For example, something like SELECT * FROM MyTable WHERE MyColumn IN (SELECT myColumn FROM AnotherTable) might be somewhat slow if the subquery returns a large number of rows. Often it's more efficient to use EXISTS instead, as in the sketch below.
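
A rough sketch of the EXISTS form of that query (MyTable, AnotherTable and the column names are just the placeholders from the example above):

SELECT *
FROM MyTable t
WHERE EXISTS (SELECT 1
              FROM AnotherTable a
              WHERE a.myColumn = t.MyColumn);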

Randy Minder

I'm pretty sure Postgres has a limit of 1000... can't find any docco to support that, though.

simonlord