declare @List   varchar(25) = '2,3,4'
declare @Delinquencies  table (id int);
insert into @Delinquencies(id) values('2'),('3'),('4'); --Line in question

@List is populated with a comma-delimited string that an SSRS report builds from the choices the user has picked. Given the way my stored procedure runs, I need to be able to insert into my table variable based on whatever varchar list comes through. How can I insert into a table variable from a dynamic varchar list? What is listed here is about as close to the testing format as I can get.

I am using SQL Server 2008.

Example

@List = '1'
insert into @Delinquencies(id) values('1')

And any combination up to

@List = '1,2,3,4'
insert into @Delinquencies(id) values('1'),('2'),('3'),('4')
Adam
  • In SQL Server 2016, there is a STRING_SPLIT function. Before that, I guess you will need to write your own function. (There have been examples up here of split_string functions.) – DVT Aug 10 '16 at 18:43
  • check this one http://stackoverflow.com/questions/32888266/convert-comma-delimited-string-to-table-or-array-in-sql-server-2008-without-usin – Ja9ad335h Aug 10 '16 at 18:44
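For readers on SQL Server 2016 or later, the STRING_SPLIT suggestion from the first comment handles this directly (it is not available on the 2008 instance the question targets, so it is shown only for completeness):

declare @List   varchar(25) = '2,3,4';
declare @Delinquencies  table (id int);

-- STRING_SPLIT (SQL Server 2016+) returns one row per item in a column named [value]
insert into @Delinquencies(id)
select value from STRING_SPLIT(@List, ',');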

3 Answers


Using one of the split string functions from here..

declare @List   varchar(25) = '2,3,4';
declare @Delinquencies  table (id int);

;with cte as
(
    select * from [dbo].[SplitStrings_Numbers](@List, ',')
)
insert into @Delinquencies(id)
select * from cte;
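The answer assumes a [dbo].[SplitStrings_Numbers] function is already installed from the linked article. As a rough sketch only, an inline numbers-table version could look something like the following; it assumes a pre-populated dbo.Numbers tally table with an int column [Number] covering at least 1 through the longest list you expect:

--Sketch only: dbo.Numbers is an assumed, pre-populated tally table (1, 2, 3, ...)
CREATE FUNCTION dbo.SplitStrings_Numbers
(
    @List      varchar(max),
    @Delimiter varchar(255)
)
RETURNS TABLE
AS
RETURN
(
    --Each Number that sits right after a delimiter (or at position 1) starts an item;
    --the item runs up to the next delimiter
    SELECT Item = SUBSTRING(@List, Number,
                    CHARINDEX(@Delimiter, @List + @Delimiter, Number) - Number)
    FROM dbo.Numbers
    WHERE Number <= LEN(@List)
      AND SUBSTRING(@Delimiter + @List, Number, LEN(@Delimiter)) = @Delimiter
);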
TheGameiswar

You need to create a dbo.StringSplit function, which takes two parameters (string, delimiter):

CREATE function [dbo].[StringSplit](
 @String varchar (max),
 @Delimiter nvarchar (10)
 )
returns @ValueTable table ([Value] varchar(max))
begin
 declare @NextString varchar(max)
 declare @Pos int
 declare @CommaCheck nvarchar(10)

 --Initialize
 set @NextString = ''
 set @CommaCheck = right(@String, len(@Delimiter))

 --Check for a trailing delimiter; if there is not one, append it
 --so the final item is picked up by the loop below
 if (@CommaCheck <> @Delimiter)
  set @String = @String + @Delimiter

 --Get position of the first delimiter
 set @Pos = charindex(@Delimiter, @String)

 --Loop while there is still a delimiter in the string
 while (@Pos <> 0)
 begin
  set @NextString = substring(@String, 1, @Pos - 1)

  insert into @ValueTable ([Value]) values (@NextString)

  --Strip off the item just extracted, along with its delimiter
  set @String = substring(@String, @Pos + len(@Delimiter), len(@String))

  set @Pos = charindex(@Delimiter, @String)
 end

 return
end

And then you can use it like below:

declare @List   varchar(25) = '2,3,4'
SELECT value from dbo.StringSplit(@List,',')
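
And to load the question's table variable, the same call can feed the insert (the numeric strings are implicitly converted to int):

declare @List   varchar(25) = '2,3,4';
declare @Delinquencies  table (id int);

insert into @Delinquencies(id)
select [Value] from dbo.StringSplit(@List, ',');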
Ja9ad335h
  • There is no need to resort to looping for this. And a table valued function with multiple lines like this will almost always perform even worse than a scalar function. Just don't do it. Here are some far better options. http://sqlperformance.com/2012/07/t-sql-queries/split-strings – Sean Lange Aug 10 '16 at 20:01

Here is a parser which returns the sequence as well.

For example:

Select * from [dbo].[udf-Str-Parse]('126,256,512',',')

Returns

Key_PS  Key_Value
1       126
2       256
3       512

The UDF

CREATE FUNCTION [dbo].[udf-Str-Parse] (@String varchar(max), @Delimeter varchar(10))
--Usage: Select * from [dbo].[udf-Str-Parse]('Dog,Cat,House,Car',',')
--       Select * from [dbo].[udf-Str-Parse]('John Cappelletti was here',' ')

Returns @ReturnTable Table (Key_PS int IDENTITY(1,1), Key_Value varchar(max))
As
Begin
   --Wrap each item in <x> tags so the string can be shredded as XML nodes
   Declare @XML xml;
   Set @XML = Cast('<x>' + Replace(@String,@Delimeter,'</x><x>') + '</x>' as XML)

   Insert Into @ReturnTable
   Select Key_Value = ltrim(rtrim(String.value('.', 'varchar(max)')))
   FROM @XML.nodes('x') as T(String)

   Return
End
John Cappelletti
  • Be careful here. This is known as a multi-statement table valued function and the performance will usually be worse than a scalar function. Convert this to a single statement so it can be an inline table valued function and it would be a super answer. – Sean Lange Aug 10 '16 at 20:02
  • @SeanLange Thanks for the tip. I'll noodle it a bit more. For me, and for now, it is a Cost vs. Convenience thing. I do keep my data normalized as much as possible. I tend to only parse strings from my web apps, but MY GOODNESS there tons of questions on SO regarding parsing. – John Cappelletti Aug 10 '16 at 20:08
  • The only one I know of that is set based, fast, and returns the row number is the Jeff Moden version. http://www.sqlservercentral.com/articles/Tally+Table/72993/ It is limited to an input of varchar(8000) but the ability to return the row number well outweighs some arguments against it. Some people around here don't like that splitter because it can't receive varchar(max) without massive performance penalties. – Sean Lange Aug 10 '16 at 20:38
  • @SeanLange Just for fun, I put that article through my parser. 38,432 characters or 6,605 words in times ranging from 390ms to 440ms. Then just because I'm me (which ain't always easy... just ask my wife), I also created a table of 3,500 records with 9 to 25 words in each row. This generated 42,297 records/words (by ID) in times between 2.379 and 2.956 seconds. Considering the amount of parsing I do, I'm OK with the performance. That, said, I still very much appreciate getting tips from the heavy hitters. – John Cappelletti Aug 10 '16 at 21:54
  • Sounds like you had some fun testing. Nice. Not sure I am in the "heavy hitters" group but appreciate the feedback. :) – Sean Lange Aug 11 '16 at 13:41
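
For reference, a single-statement reshaping along the lines Sean Lange suggests could look roughly like the sketch below. The function name is made up for illustration, and note that the ROW_NUMBER() ordering of the shredded XML nodes is not formally guaranteed, which is part of why returning the sequence from a set-based splitter is the tricky bit:

--Sketch only: hypothetical inline variant of the XML splitter above
CREATE FUNCTION [dbo].[udf-Str-Parse-Inline] (@String varchar(max), @Delimeter varchar(10))
RETURNS TABLE
AS
RETURN
(
    SELECT Key_PS    = ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),
           Key_Value = LTRIM(RTRIM(x.n.value('.', 'varchar(max)')))
    FROM (SELECT CAST('<x>' + REPLACE(@String, @Delimeter, '</x><x>') + '</x>' AS xml) AS Data) d
    CROSS APPLY d.Data.nodes('x') AS x(n)
);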