
I'm using MS SQL 2008 R2 and have three tables with the following schema:

Table 1: Contains workshift info for each worker

CREATE TABLE workshift (
[ws_id] [bigint] NOT NULL,
[start_date] [datetime] NOT NULL,
[end_date] [datetime] NOT NULL,
[worker_id] [bigint] NOT NULL
)

INSERT INTO workshift VALUES (1, '2012-08-20 08:30:00', '2012-08-20 14:30:00', 1)
INSERT INTO workshift VALUES (2, '2012-08-20 14:30:00', '2012-08-20 22:30:00', 2)

Table 2: Contains monetary denominations

CREATE TABLE currency_denom (
[cd_id] [decimal](7, 2) NOT NULL,
[name] [nchar](100) NOT NULL
)

INSERT INTO currency_denom VALUES (1, '100.00')
INSERT INTO currency_denom VALUES (2, '50.00')
INSERT INTO currency_denom VALUES (3, '20.00')
INSERT INTO currency_denom VALUES (4, '10.00')
INSERT INTO currency_denom VALUES (5, '5.00')
INSERT INTO currency_denom VALUES (6, '1.00')

Table 3: Contains the quantity of each denomination the worker has received in every workshift

CREATE TABLE currency_by_workshift (
[cd_id] [decimal](7, 2) NOT NULL,
[ws_id] [bigint] NOT NULL,
[qty] [int] NOT NULL
)

INSERT INTO currency_by_workshift VALUES (1, 1, 1)
INSERT INTO currency_by_workshift VALUES (2, 1, 2)
INSERT INTO currency_by_workshift VALUES (3, 1, 2)
INSERT INTO currency_by_workshift VALUES (2, 2, 3)
INSERT INTO currency_by_workshift VALUES (4, 2, 4)
INSERT INTO currency_by_workshift VALUES (5, 2, 2)

I need to get the currency_by_workshift values as columns instead of rows, along with the workshift values, that is:

ws_id | start_date          | end_date            | 100.00 | 50.00 | 20.00 | 10.00 | 5.00 | 1.00
------+---------------------+---------------------+--------+-------+-------+-------+------+------
  1   | 2012-08-20 08:30:00 | 2012-08-20 14:30:00 |    1   |   2   |   2   |   0   |   0  |   0
  2   | 2012-08-20 14:30:00 | 2012-08-20 22:30:00 |    0   |   3   |   0   |   4   |   2  |   0

I can't use a CASE expression to count quantities for each currency denomination because the denominations are configurable: if a new one is added, the query would have to be modified. The same seems to apply to the PIVOT function, or am I wrong?

How can I get the info that way?

RichardTheKiwi
Alex
  • Try searching for dynamic pivot - effectively, you create a pivot using dynamic sql – podiluska Aug 22 '12 at 14:08
  • The question has been placed a few more times on SO. Try searching for PIVOT. You can find a similar problem here http://stackoverflow.com/questions/10976585/convert-rows-to-columns-using-pivot-in-mssql-when-columns-are-string-data-type – Dumitrescu Bogdan Aug 22 '12 at 14:10

2 Answers


What you are trying to do is called a PIVOT. There are two ways to do it: with a static pivot or a dynamic pivot.

Static pivot: you hard-code the values of the rows to transform into columns (see SQL Fiddle with Demo):

select ws_id,
  start_date,
  end_date,
  IsNull([100.00], 0) [100.00],
  IsNull([50.00], 0) [50.00],
  IsNull([20.00], 0) [20.00],
  IsNull([10.00], 0) [10.00],
  IsNull([5.00], 0) [5.00],
  IsNull([1.00], 0) [1.00]
from 
(
  select ws.ws_id,
    ws.start_date,
    ws.end_date,
    cd.name,
    cbw.qty
  from workshift ws
  left join currency_by_workshift cbw
    on ws.ws_id = cbw.ws_id
  left join currency_denom cd
    on cbw.cd_id = cd.cd_id
) x
pivot
(
  sum(qty)
  for name in ([100.00], [50.00], [20.00], [10.00], [5.00], [1.00])
) p
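As the question points out, the static version has to be edited by hand whenever a denomination changes. Just to illustrate the maintenance cost (the 200.00 note here is hypothetical, not part of the sample data):

```sql
-- Hypothetical new denomination:
INSERT INTO currency_denom VALUES (7, '200.00')

-- The static query must then be updated in two places:
--   1) the SELECT list:    IsNull([200.00], 0) [200.00],
--   2) the PIVOT IN list:  for name in ([200.00], [100.00], [50.00], ...)
-- This is exactly why the dynamic version builds these lists from
-- currency_denom at run-time.
```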

Dynamic pivot: the columns are determined at run-time (see SQL Fiddle with Demo):

DECLARE @cols AS NVARCHAR(MAX),
    @query  AS NVARCHAR(MAX),
    @colsPivot AS NVARCHAR(MAX)

select @colsPivot = 
  STUFF((SELECT ', IsNull(' + QUOTENAME(rtrim(name)) +', 0) as ['+ rtrim(name)+']' 
                    from currency_denom
                   GROUP BY name
                   ORDER BY cast(name as decimal(10, 2)) desc
            FOR XML PATH(''), TYPE
            ).value('.', 'NVARCHAR(MAX)') 
        ,1,1,'')

select @cols = STUFF((SELECT distinct ', ' + QUOTENAME(name)
                    from currency_denom
            FOR XML PATH(''), TYPE
            ).value('.', 'NVARCHAR(MAX)') 
        ,1,1,'')
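If you want to sanity-check the dynamic parts, you can PRINT both variables before executing the query. For the sample denominations they should come out roughly like this (wrapped here for readability; note that @cols uses DISTINCT without an ORDER BY, so its order is not guaranteed):

```sql
PRINT @colsPivot
--  IsNull([100.00], 0) as [100.00], IsNull([50.00], 0) as [50.00],
--  IsNull([20.00], 0) as [20.00], IsNull([10.00], 0) as [10.00],
--  IsNull([5.00], 0) as [5.00], IsNull([1.00], 0) as [1.00]

PRINT @cols
--  [1.00], [10.00], [100.00], [20.00], [5.00], [50.00]
```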


set @query 
      = 'SELECT ws_id, start_date, end_date,' + @colsPivot + ' from 
         (
            select ws.ws_id,
              ws.start_date,
              ws.end_date,
              cd.name,
              cbw.qty
            from workshift ws
            left join currency_by_workshift cbw
              on ws.ws_id = cbw.ws_id
            left join currency_denom cd
              on cbw.cd_id = cd.cd_id
         ) x
         pivot 
         (
            sum(qty)
            for name in (' + @cols + ')
         ) p '

execute(@query)
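A small aside: EXECUTE(@query) works fine here, but sp_executesql is the generally recommended way to run dynamic SQL (it supports parameters and encourages plan reuse), and in this case it is a drop-in replacement:

```sql
EXEC sp_executesql @query
```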

Both versions will produce the same results.

Taryn
  • +1 - I knew when I saw this question that you were gonna answer it – Lamak Aug 22 '12 at 14:21
  • @Lamak what can I say, I like PIVOTs. :) – Taryn Aug 22 '12 at 14:22
  • @bluefeet, I have a few comments: 1) In the data table for the pivots, the join between the `workshift` and `currency_by_workshift` tables is between mismatched keys (a simple type-o). 2) When creating `@colsPivot`, instead of a `distinct` selection, do a `group by` and `order by` so the columns are in the correct order (keep things neat). 3) `rtrim(name)` in the dynamic creation of the column lists will take care of all the extra white space (again, keep things neat). – chezy525 Aug 22 '12 at 15:37
  • thank you, just change ws.ws_id = cbw.ws_id to make table relation correctly and it works like a charm! – Alex Aug 22 '12 at 20:28
  • @Alex happy to help, I didn't see that typo when I initially posted. – Taryn Aug 22 '12 at 20:29
  • @bluefeet, Thanks for this solution, works just fine. I was wondering how can I get total of all fields for each row? – Akbari Mar 12 '16 at 07:11
  • @Akbari You should post a new question with all the details to figure out the total, then any number of users could help you solve it – Taryn Mar 15 '16 at 13:40

@bluefeet provided a very good answer using the built-in PIVOT functionality. However, I frequently find the PIVOT and UNPIVOT nomenclature confusing, and I have yet to encounter a situation where the same results can't be achieved with standard aggregation:

select w.ws_id, w.start_date, w.end_date,
    [100.00] = isnull(sum(case when c.name='100.00' then cw.qty else null end), 0),
    [50.00]  = isnull(sum(case when c.name='50.00'  then cw.qty else null end), 0),
    [20.00]  = isnull(sum(case when c.name='20.00'  then cw.qty else null end), 0),
    [10.00]  = isnull(sum(case when c.name='10.00'  then cw.qty else null end), 0),
    [5.00]   = isnull(sum(case when c.name='5.00'   then cw.qty else null end), 0),
    [1.00]   = isnull(sum(case when c.name='1.00'   then cw.qty else null end), 0)
from workshift w
    join currency_by_workshift cw on w.ws_id=cw.ws_id
    join currency_denom c on cw.cd_id=c.cd_id
group by w.ws_id, w.start_date, w.end_date
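One subtle difference from the PIVOT answer: the inner joins here drop any workshift that has no currency rows at all. If such shifts should still appear with all-zero columns, the same left joins used in the accepted answer work here too, for example:

```sql
select w.ws_id, w.start_date, w.end_date,
    [100.00] = isnull(sum(case when c.name='100.00' then cw.qty end), 0),
    [50.00]  = isnull(sum(case when c.name='50.00'  then cw.qty end), 0),
    [20.00]  = isnull(sum(case when c.name='20.00'  then cw.qty end), 0),
    [10.00]  = isnull(sum(case when c.name='10.00'  then cw.qty end), 0),
    [5.00]   = isnull(sum(case when c.name='5.00'   then cw.qty end), 0),
    [1.00]   = isnull(sum(case when c.name='1.00'   then cw.qty end), 0)
from workshift w
    left join currency_by_workshift cw on w.ws_id=cw.ws_id
    left join currency_denom c on cw.cd_id=c.cd_id
group by w.ws_id, w.start_date, w.end_date
```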

If you want to do a dynamic pivot, you only need to build a string of the pivot columns once:

DECLARE @cols AS NVARCHAR(MAX),
    @query  AS NVARCHAR(MAX)

select @cols = 
stuff(( select replace(',[@name] = isnull(sum(case when c.name=''@name'' then cw.qty else null end), 0)'
                       , '@name', rtrim(name))
        from currency_denom
        order by cd_id
        for xml path(''), type
    ).value('.', 'nvarchar(max)')
    ,1,1,'')

select @query = '
select w.ws_id, w.start_date, w.end_date, '+@cols+'
from workshift w
    join currency_by_workshift cw on w.ws_id=cw.ws_id
    join currency_denom c on cw.cd_id=c.cd_id
group by w.ws_id, w.start_date, w.end_date
'

execute(@query)
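One caveat with this style of string building: name is spliced directly into the generated SQL, both as a column alias and as a string literal. With the denominations above that is harmless, but since the values are configurable, a name containing a quote or bracket would break (or inject into) the query. A slightly more defensive sketch of the @cols assignment (same idea, just quoting the identifier with QUOTENAME and escaping the literal) could look like:

```sql
select @cols =
stuff(( select ',' + quotename(rtrim(name))
               + ' = isnull(sum(case when c.name='''
               + replace(rtrim(name), '''', '''''')   -- escape embedded quotes
               + ''' then cw.qty else null end), 0)'
        from currency_denom
        order by cd_id
        for xml path(''), type
    ).value('.', 'nvarchar(max)')
    ,1,1,'')
```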
chezy525
  • I quite agree that the PIVOT feature, being relatively new, may indeed seem confusing at first. However, in my opinion, *dynamic* pivoting is easier to implement with PIVOT than with grouping + conditional aggregating. – Andriy M Aug 22 '12 at 15:57
  • @AndriyM, I disagree. I've updated my answer to include what I'd do for a dynamic query, and I find it much cleaner and easier to understand than the equivalent `PIVOT` answer. However, at the end of the day, both work! – chezy525 Aug 22 '12 at 16:23
  • That's not fair, you've *made* it look easy! :) – Andriy M Aug 22 '12 at 17:21