
Say you have a table where one column has repeated values. How can I add another column that shows how many times that value has shown up SO FAR (top-down)?

Ex. You have a column, say "ccode", and in ccode the value "R52" repeats twice. Rather than joining the final count (2), I want the first appearance of R52 to have count=1, the second to have count=2, and so on...

CREATE TABLE Temp
(
    ccode  varchar(50),
    name   varchar(50),
    Val1   varchar(50),
    g_Name varchar(50),
    ce_hybrid varchar(50)
 )
 
 INSERT INTO Temp VALUES
 ('R52', 'adam@email.ca', 1, 'WALT', '3P'),
 ('R52', 'adam@email.ca', 2, 'KEN', '3P'),
 ('R00', 'alison@email.ca', 1, 'QUIN', '3P')

SELECT ccode, name, [1_G_Name], [2_G_Name], [1_Hybrids], [2_Hybrids]
FROM
(
    SELECT ccode, name, Col, Val
    FROM
    (
        SELECT *, Val1 + '_G_Name' AS Col, g_Name AS Val FROM Temp
        UNION
        SELECT *, Val1 + '_Hybrids' AS Col, ce_hybrid AS Val FROM Temp
    ) t
) tt
PIVOT ( MAX(Val) FOR Col IN ([1_G_Name], [2_G_Name], [1_Hybrids], [2_Hybrids]) ) AS pvt

For a better idea: http://sqlfiddle.com/#!18/6160d/2

I want a table like the one above, but with the Val1 column added afterwards (dynamically) based on the repeats SO FAR in the table (top-down).

This output (the image below) is CORRECT. But say my table didn't have the Val1 column:

 INSERT INTO Temp VALUES
 ( 'R52', 'adam@email.ca', 'WALT', '3P'),
 ( 'R52', 'adam@email.ca', 'KEN', '3P'),
 ( 'R00', 'alison@email.ca', 'QUIN', '3P')

How would I add the Val1 column with the values (1, 2, 1) based on the repeat count, as described above?

Required Output: (image omitted)

  • So, for the sample data you have given us, what are your expected results? – Thom A Nov 12 '20 at 17:32
  • @Larnu The expected results are exactly what appears when you open the link at the bottom. The results are right, but the way I get them is wrong. I don't want to hardcode (1,2,1) for Val1. I want to find a way to join those values afterwards based on the number of times there is a repeat. Ex. R52 repeats twice: the first time it occurs it gets Val1=1, the second time Val1=2, and so on. – swazzy123 Nov 12 '20 at 18:14
  • *"The results are right, but they way I get them is wrong"* If they are right, how can they be wrong..? – Thom A Nov 12 '20 at 18:17
  • @Larnu I have a lot of data and I can't go and hardcode the Val1 column; I need to add it dynamically and still get the same behavior (I updated my question if that helps). Sorry, I know it's confusing. Also, if there is a better way to get that output with just the second INSERT statement in the question (before the image), that would also work. Noob here, so my bad if it doesn't make sense – swazzy123 Nov 12 '20 at 18:34
  • So are you asking how to do a [dynamic pivot](https://stackoverflow.com/q/10404348/2029983)? – Thom A Nov 12 '20 at 18:36
  • @Larnu pretty much. But do you see how in that question there is a category column? That is my Val1 column. I need to also add that dynamically based on the repeats in the date column in that question for example. – swazzy123 Nov 12 '20 at 18:44
  • I think we need better sample data and expected results here. – Thom A Nov 12 '20 at 18:46

1 Answer


I got an answer thanks to an amazing senior developer at work. I would feel bad if I didn't share, so:

SELECT *, RANK() OVER (PARTITION BY ccode ORDER BY g_name) AS Val1 FROM Temp

Use RANK() with PARTITION BY over the table. I partitioned by ccode, so the count starts at 1 for each distinct ccode and increases by 1 each time the same ccode appears again in the table.

Example 1: http://sqlfiddle.com/#!18/95a0d5/6

Example 2: http://sqlfiddle.com/#!18/569d8/1

Example 3: http://sqlfiddle.com/#!18/41bf32/1

In Example 3, notice that because we used ORDER BY g_name and there are two identical names (KEN and KEN) for ccode=R52, both of those rows get Val1 = 2, and the next row gets 4 (RANK() skips 3 after a tie).
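That tie-skipping is a property of RANK(): tied rows share a rank and the next rank jumps. If you want a strict 1, 2, 3 per ccode even when the ORDER BY values tie, ROW_NUMBER() is the usual alternative. This is not from the original answer, just a small sketch of the difference; it uses Python's sqlite3 as a stand-in for SQL Server (SQLite 3.25+ supports the same window functions), with an extra duplicated KEN row added to force a tie:

```python
# Compare RANK() vs ROW_NUMBER() on tied ORDER BY values,
# using SQLite as a stand-in for SQL Server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Temp (ccode TEXT, name TEXT, g_name TEXT, ce_hybrid TEXT)")
conn.executemany(
    "INSERT INTO Temp VALUES (?, ?, ?, ?)",
    [
        ("R52", "adam@email.ca", "KEN", "3P"),
        ("R52", "adam@email.ca", "KEN", "3P"),   # duplicate g_name -> forces a tie
        ("R52", "adam@email.ca", "WALT", "3P"),
        ("R00", "alison@email.ca", "QUIN", "3P"),
    ],
)

rows = conn.execute("""
    SELECT ccode, g_name,
           RANK()       OVER (PARTITION BY ccode ORDER BY g_name) AS rnk,
           ROW_NUMBER() OVER (PARTITION BY ccode ORDER BY g_name) AS rn
    FROM Temp
    ORDER BY ccode, g_name
""").fetchall()

# For the R52 partition: RANK() gives 1, 1, 3 (skips 2's successor),
# while ROW_NUMBER() gives 1, 2, 3.
for row in rows:
    print(row)
```

For the running "count so far" the question asks about, ROW_NUMBER() avoids the skipped-value surprise; RANK() only differs when the ORDER BY column has duplicates within a partition.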

I ignored the rest of the code regarding the pivot, since my question was more about this RANK/PARTITION part. I'm not super familiar with it beyond what was explained over a call, but I hope it helps someone.

P.S. what would be a better name for this question?
