
Oracle 19c

We have a requirement to provide some report data from the EVENT table. The EVENT and ITEM tables have schemas like this:

    CREATE TABLE EVENT
       (    
        "ID" NUMBER(19,0) NOT NULL ENABLE, 
        "CREATED" TIMESTAMP (6) NOT NULL, 
        "CUSTOMER_ID" VARCHAR2(255 CHAR) NOT NULL, 
        "CONF_ID" VARCHAR2(255 CHAR), 
        "STATE" VARCHAR2(255 CHAR) NOT NULL,
        "ITEM_ID" VARCHAR2(255 CHAR) NOT NULL
        ...
       )

    CREATE TABLE ITEM
    (   
        "ID" NUMBER(19,0) NOT NULL ENABLE,
        "NAME" VARCHAR2(255 CHAR) NOT NULL
        ....

        primary key (ID)
    )

  alter table EVENT
   add constraint EVENT_FK_ITEM_BID
      foreign key (ITEM_ID)
      references ITEM;

Events are created with different states as they occur in real time, and each event is bound to the ITEM table via ITEM_ID.

What we want to achieve is a count of event states (considering only the latest state per ITEM_ID), grouped by CUSTOMER_ID and CONF_ID.

The EVENT table could have more than 2 million rows.

The result should look like this:

CUSTOMER_ID  CONF_ID  ACTIVATED  DEACTIVATED  SUSPENDED
-----------  -------  ---------  -----------  ---------
1            2        50000      20000        5000
1            1        70000      30000        2000
2            1        80000      10000        10000
2            2        50000      20000        5000
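
For illustration (hypothetical sample data, ignoring the elided columns), an item with an older ACTIVATED event and a newer SUSPENDED event should be counted only once, under SUSPENDED:

    -- Hypothetical rows: item '42' should contribute 1 to SUSPENDED only,
    -- because its SUSPENDED event is the latest one.
    INSERT INTO ITEM (ID, NAME) VALUES (42, 'Sample item');

    INSERT INTO EVENT (ID, CREATED, CUSTOMER_ID, CONF_ID, STATE, ITEM_ID)
    VALUES (1, TIMESTAMP '2022-11-01 09:00:00', '1', '2', 'ACTIVATED', '42');

    INSERT INTO EVENT (ID, CREATED, CUSTOMER_ID, CONF_ID, STATE, ITEM_ID)
    VALUES (2, TIMESTAMP '2022-11-05 10:30:00', '1', '2', 'SUSPENDED', '42');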

Could you please guide us in building an efficient query?

user3027786

1 Answer


You can find the latest rows using the RANK analytic function (or the ROW_NUMBER analytic function if there will only ever be one latest row for each customer_id, conf_id, item_id tuple) and then PIVOT:

SELECT *
FROM   (
  SELECT customer_id,
         conf_id,
         state
  FROM   (
    SELECT customer_id,
           conf_id,
           state,
           RANK() OVER (PARTITION BY customer_id, conf_id, item_id ORDER BY created DESC) AS rnk
    FROM   event
    WHERE  state IN ('ACTIVATED', 'SUSPENDED', 'DEACTIVATED')
  )
  WHERE  rnk = 1
)
PIVOT (
  COUNT(*)
  FOR state IN (
    'ACTIVATED' AS activated,
    'SUSPENDED' AS suspended,
    'DEACTIVATED' AS deactivated
  )
);

fiddle
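
As raised in the comments below, if you only want to count events from a given period then the date filter can go in the innermost query. For example, a sketch that assumes a rolling 30-day window on `created`:

SELECT *
FROM   (
  SELECT customer_id,
         conf_id,
         state
  FROM   (
    SELECT customer_id,
           conf_id,
           state,
           RANK() OVER (PARTITION BY customer_id, conf_id, item_id ORDER BY created DESC) AS rnk
    FROM   event
    WHERE  state IN ('ACTIVATED', 'SUSPENDED', 'DEACTIVATED')
    AND    created >= SYSTIMESTAMP - INTERVAL '30' DAY  -- assumed reporting window
  )
  WHERE  rnk = 1
)
PIVOT (
  COUNT(*)
  FOR state IN (
    'ACTIVATED' AS activated,
    'SUSPENDED' AS suspended,
    'DEACTIVATED' AS deactivated
  )
);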

MT0
  • Is a join between `Item.id` and `Event.item_id` also considered here? If we only want rows from the last month, we need to add a `created` BETWEEN clause in the first WHERE clause, right? – user3027786 Nov 22 '22 at 10:51
  • @user3027786 There is no `item` table in your question so, no, a `JOIN` between `Item.Id` and `Event.item_id` is not considered because you have never told us there would be such a join nor that it needed to be considered. – MT0 Nov 22 '22 at 10:54
  • @user3027786 Yes, if you want to only get rows from the last month then you can add the filter to the inner query. – MT0 Nov 22 '22 at 10:55
  • This is my fault :( I only wrote that Events are bound to the Item table with Item_id – user3027786 Nov 22 '22 at 10:55
  • Presumably, any join would not need to be considered if you have a referential constraint on the `Event` table and then there cannot be any `Event.item_id`s that are not in the `Item` table. – MT0 Nov 22 '22 at 10:57
  • Can we take Min(conf_id) into account as well? For an Item: an Event created in the past would definitely have a conf_id, but an Event registered for the Item at a later stage might have no conf_id (it is possible that the conf was deleted). In this case we will see two tuples for the same Item's Events with your query, one with a conf_id and the other with conf_id NULL. – user3027786 Nov 23 '22 at 09:05
  • @user3027786 Sounds like you may want to move `conf_id` from the `PARTITION BY` clause to the `ORDER BY` clause. However, I'm not sure if that is correct as the expected result in your question has multiple `conf_id`s for each `customer_id`. It may be better to [ask a new question](https://stackoverflow.com/questions/ask) where you can explain the problem in detail. – MT0 Nov 23 '22 at 09:39
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/249829/discussion-between-user3027786-and-mt0). – user3027786 Nov 23 '22 at 10:02