
I am currently working on a project which collects a list of IDs (belonging to two tables) from a user selection, which then requires further filtering based on whatever the user has selected. To do this, I need to send the collection of IDs as a parameter (comma-separated values) to the stored procedure. But as I understand it, a SQL Server stored procedure parameter declared as nvarchar (without max) can only accept up to 4000 characters, beyond which the value is truncated...

Is there any way I could efficiently do this without a lot of overhead on the database and without having to cache anything in the client side?

Thanks for your suggestions.

    Passing millions of values into a single parameter in itself is a lot of overhead... You need to rethink whatever it is you are trying to do. – user1666620 May 19 '16 at 13:49
  • You can use a **table-valued parameter** in SQL Server for that. – Krishnraj Rana May 19 '16 at 13:51
  • Duplicate? http://stackoverflow.com/questions/337704/parameterize-an-sql-in-clause – 3615 May 19 '16 at 13:52
  • Can you show some sample data and the way you are trying to achieve that? – TheGameiswar May 19 '16 at 13:55
  • You can split on the comma and load the values into a user-defined table type, which SQL Server supports as a parameter (see the sketch after these comments). – PSK May 19 '16 at 13:58
  • Have you considered `bulk insert`ing your IDs into a temporary table with a randomly generated key, and simply sending that key to the stored procedure? – Louis May 19 '16 at 13:59
  • If you are using SQL Server 2008 or later (which you should be, since older versions are really, really old by now), go with a table-valued parameter. – Zohar Peled May 19 '16 at 14:06
  • SQL Server has data types *designed* for holding multiple values - table-valued parameters, as already mentioned, or XML. Deciding to stuff everything into a string as "comma separated values" just gives you more work to do in extracting values. That being said, if you truly are working with millions of values, I'd suggest you take a step back and consider the feasibility of what you're trying to do here. – Damien_The_Unbeliever May 19 '16 at 14:09
  • @user1666620: I know, but this is a legacy system that provides little flexibility for using table-valued parameters and the like. It was designed this way ages ago and there's not much we can do about it now. It's a heavy parent-child relationship and we need to send all parent + child IDs for each parent they select. – Computer Scientist May 19 '16 at 14:48
  • @ComputerScientist perhaps this is a good time to seriously consider a rewrite of this portion of the application? If it cannot be easily extended to handle your new requirements, and is as old as you suggest, then maybe it has outlived its usefulness. – user1666620 May 19 '16 at 15:40
  • @user1666620: I wish it were as easy as you suggest. The code base has millions of lines, more than 10k users, and the system is used across multiple countries. A lot of developers work on it every day and yes, it's being improved slightly each day, but given the time constraints and other complexities, it's not that easy. Thanks for your suggestion anyway. – Computer Scientist May 19 '16 at 19:42
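Given the legacy constraint described in these comments, the lowest-impact option is the split approach PSK mentions: widen the parameter to varchar(max), which removes the 4000-character ceiling, and split it server-side. A minimal sketch with hypothetical table and procedure names (STRING_SPLIT requires SQL Server 2016 or later; on older versions you would substitute a numbers-table or XML-based splitter):

```sql
-- Hypothetical procedure: varchar(max) avoids the truncation problem.
CREATE PROCEDURE dbo.GetItemsByCsv
    @IdCsv varchar(max)
AS
BEGIN
    SET NOCOUNT ON;

    -- STRING_SPLIT is available from SQL Server 2016 onwards.
    SELECT i.*
    FROM dbo.Items AS i
    INNER JOIN STRING_SPLIT(@IdCsv, ',') AS s
        ON i.Id = CAST(s.value AS int);
END
```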

2 Answers


The normal way to send a table of data to SQL Server is a table-valued parameter. I'm not sure I'd recommend it for millions of values, though, since it's a RAM-based exercise. The alternative is to use SQL bulk insert into a temporary holding table and access your data via the holding table instead of a formal parameter.
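For what it's worth, a minimal sketch of the table-valued parameter approach; the type, procedure, and table names here are hypothetical:

```sql
-- Hypothetical table type: one ID per row.
CREATE TYPE dbo.IdList AS TABLE (Id int NOT NULL PRIMARY KEY);
GO

-- Table-valued parameters must be declared READONLY.
CREATE PROCEDURE dbo.GetItemsByIdList
    @Ids dbo.IdList READONLY
AS
BEGIN
    SET NOCOUNT ON;
    SELECT i.*
    FROM dbo.Items AS i
    INNER JOIN @Ids AS ids
        ON ids.Id = i.Id;
END
GO

-- Calling it from T-SQL; from .NET you would pass a DataTable or
-- IEnumerable<SqlDataRecord> as a parameter of SqlDbType.Structured.
DECLARE @ids dbo.IdList;
INSERT INTO @ids (Id) VALUES (101), (102), (103);
EXEC dbo.GetItemsByIdList @Ids = @ids;
```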

PhillipH

You could use a "fake" temporary table: a permanent table reserved for this purpose. Insert all of your target IDs into it (either one at a time or in chunks) from your client app. Then, when they are all in there, do your join against that table and return your results. When you're done, clear out the table.
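A rough sketch of that pattern, with hypothetical table and column names; the batch key keeps concurrent users' selections separate:

```sql
-- Hypothetical permanent holding table, shared by all callers.
CREATE TABLE dbo.IdStaging (
    BatchKey uniqueidentifier NOT NULL,
    Id       int              NOT NULL,
    PRIMARY KEY (BatchKey, Id)
);
GO

CREATE PROCEDURE dbo.GetItemsByBatch
    @BatchKey uniqueidentifier
AS
BEGIN
    SET NOCOUNT ON;

    SELECT i.*
    FROM dbo.Items AS i
    INNER JOIN dbo.IdStaging AS s
        ON s.Id = i.Id
       AND s.BatchKey = @BatchKey;

    -- Remove this caller's rows once the results have been read.
    DELETE FROM dbo.IdStaging
    WHERE BatchKey = @BatchKey;
END
```

The client generates the key (e.g. `NEWID()` or `Guid.NewGuid()`), inserts the (BatchKey, Id) pairs, for instance with SqlBulkCopy, and then calls the procedure with that key.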

Ash8087