
I have a table like this:

PROFILE_ID X_START_DATE X_END_DATE   FORMER_EMPLOYER    NEW_EMPLOYER  START_DATE
1          2015-07-20   2016-07-20   GOOGLE             BURGER KING   2017-01-01
1          2003-10-25   2009-01-14   FACEBOOK           BURGER KING   2017-01-01
2          2007-10-04   2008-05-05   MICHAELS           KFC           2017-01-01
2          2008-05-06   2009-05-05   GOOGLE             KFC           2017-01-01
2          2009-05-06   2010-05-05   FACEBOOK           KFC           2017-01-01
3          2007-10-04   2008-05-05   MCDONALDS          BURGER KING   2017-01-01

What I want:

For each PROFILE_ID, I need the row that contains the latest X_END_DATE. For PROFILE_ID 1 that would be row 1, and so on.

When I do:

Select profile_id, max(x_end_date)
from yourtable
group by 1;

This actually gives me what I want, but not all of the columns that I need. As soon as I select more columns, I have to add them to my GROUP BY clause, which is not what I want.

Thanks!!

Johnny Banana

3 Answers


Another way, using IN and a subquery:

select * 
from yourtable 
where (profile_id, x_end_date) IN
      (select profile_id, max(x_end_date) as x_end_date
       from yourtable
       group by profile_id);

If you prefer joins:

select a.* 
from yourtable a 
INNER JOIN (select profile_id, max(x_end_date) as x_end_date
            from yourtable
            group by profile_id) b 
ON a.profile_id = b.profile_id and a.x_end_date = b.x_end_date;
Srini V
  • I prefer to avoid using IN for lists longer than I'd be prepared to type by hand. This may also reasonably require two table scans: one for the group and another for the IN filter. Using analytics typically requires only one scan, which can be a huge performance improvement on large tables – Caius Jard Nov 21 '17 at 10:18
  • As I said, it's just another way; since you have answered one way, I have included another possibility. I too would write one with a JOIN – Srini V Nov 21 '17 at 10:20
  • `GROUP BY profile_id` makes the query ANSI/ISO SQL compliant! – jarlh Nov 21 '17 at 10:21
  • @jarlh Amended. – Srini V Nov 21 '17 at 10:23
  • The OP states they need 1 row per group - if there are multiple rows with the same maximum end date then this will return multiple rows per group. – MT0 Nov 21 '17 at 10:36
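The IN-with-subquery approach can be run end-to-end against the sample data from the question. A minimal sketch using Python's built-in sqlite3 module (the table name `yourtable` comes from the answer, not the question; SQLite supports row-value `IN` comparisons since version 3.15):

```python
import sqlite3

# Build the question's sample table in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE yourtable (
        profile_id      INTEGER,
        x_start_date    TEXT,
        x_end_date      TEXT,
        former_employer TEXT,
        new_employer    TEXT,
        start_date      TEXT
    );
    INSERT INTO yourtable VALUES
        (1, '2015-07-20', '2016-07-20', 'GOOGLE',    'BURGER KING', '2017-01-01'),
        (1, '2003-10-25', '2009-01-14', 'FACEBOOK',  'BURGER KING', '2017-01-01'),
        (2, '2007-10-04', '2008-05-05', 'MICHAELS',  'KFC',         '2017-01-01'),
        (2, '2008-05-06', '2009-05-05', 'GOOGLE',    'KFC',         '2017-01-01'),
        (2, '2009-05-06', '2010-05-05', 'FACEBOOK',  'KFC',         '2017-01-01'),
        (3, '2007-10-04', '2008-05-05', 'MCDONALDS', 'BURGER KING', '2017-01-01');
""")

# Keep only rows whose (profile_id, x_end_date) pair matches the per-profile maximum.
rows = conn.execute("""
    SELECT *
    FROM yourtable
    WHERE (profile_id, x_end_date) IN
          (SELECT profile_id, MAX(x_end_date)
           FROM yourtable
           GROUP BY profile_id)
    ORDER BY profile_id
""").fetchall()

for row in rows:
    print(row)
```

This returns one row per profile (row 1 for profile 1, the FACEBOOK row for profile 2, the MCDONALDS row for profile 3), subject to the tie caveat MT0 raises above.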

In Oracle, this is most easily done using what's called an analytic function:

  SELECT * FROM
  (
   SELECT
    t.*,
    ROW_NUMBER() OVER (PARTITION BY t.profile_id ORDER BY t.x_end_date DESC) as rown
   FROM
    yourtable t
  ) a
  WHERE rown = 1

The way it works is: the ROW_NUMBER function assigns an incrementing number to each block of rows in a partition - all rows in the same partition (same profile_id) are numbered together. The order of the rows is specified by x_end_date descending (most recent first). This way we know that, for every distinct profile_id value, the row with the most recent x_end_date will always be numbered 1. Then we just select those rows.
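The same query can be tried out with SQLite, which also supports window functions (since version 3.25). A sketch using Python's sqlite3 module and the question's sample data (the table name `yourtable` is assumed):

```python
import sqlite3

# Recreate the question's sample table in memory.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE yourtable (
        profile_id INTEGER, x_start_date TEXT, x_end_date TEXT,
        former_employer TEXT, new_employer TEXT, start_date TEXT
    );
    INSERT INTO yourtable VALUES
        (1, '2015-07-20', '2016-07-20', 'GOOGLE',    'BURGER KING', '2017-01-01'),
        (1, '2003-10-25', '2009-01-14', 'FACEBOOK',  'BURGER KING', '2017-01-01'),
        (2, '2007-10-04', '2008-05-05', 'MICHAELS',  'KFC',         '2017-01-01'),
        (2, '2008-05-06', '2009-05-05', 'GOOGLE',    'KFC',         '2017-01-01'),
        (2, '2009-05-06', '2010-05-05', 'FACEBOOK',  'KFC',         '2017-01-01'),
        (3, '2007-10-04', '2008-05-05', 'MCDONALDS', 'BURGER KING', '2017-01-01');
""")

# Number the rows per profile_id, latest x_end_date first, then keep row number 1.
rows = conn.execute("""
    SELECT * FROM
    (
        SELECT t.*,
               ROW_NUMBER() OVER (PARTITION BY t.profile_id
                                  ORDER BY t.x_end_date DESC) AS rown
        FROM yourtable t
    ) a
    WHERE rown = 1
    ORDER BY profile_id
""").fetchall()

for row in rows:
    print(row)  # exactly one row per profile_id
```

Because ROW_NUMBER assigns a unique number even when two rows tie on x_end_date, this approach is guaranteed to return exactly one row per group.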

Caius Jard

If you are using GROUP BY, then you have to apply aggregate functions to all the other columns, to specify which row's data should be returned for each group.

Remember that GROUP BY will give you a single record per value of the grouped column.

Select profile_id,
max(X_START_DATE),
max(X_END_DATE),
max(FORMER_EMPLOYER),
max(NEW_EMPLOYER),
max(START_DATE)
from yourtable
group by profile_id;

There is no need to add all the unnecessary columns to the GROUP BY; just apply aggregate functions to them.

Deepesh kumar Gupta
    `GROUP BY 1` will not group by the first column but by the number literal `1`. Also this will not get the maximum from the same row as the maximum start date could be in a different row to the maximum end date and same for the other columns. – MT0 Nov 21 '17 at 10:33
  • MT0, thanks for the suggestion. changed it to column name. – Deepesh kumar Gupta Nov 21 '17 at 10:42
  • If you are going to use `MAX` then you want to use `KEEP ( DENSE_RANK LAST ORDER BY X_END_DATE )` on the non-end-date columns so you return values corresponding to the latest end-date (see https://stackoverflow.com/a/121661/1509264) – MT0 Nov 21 '17 at 11:04
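MT0's objection can be demonstrated concretely: taking MAX of each column independently can mix values from different rows. A small sketch in Python/SQLite using the profile-2 rows from the question (table name `yourtable` assumed):

```python
import sqlite3

# Just the three profile-2 rows from the question's sample data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE yourtable (
        profile_id INTEGER, x_start_date TEXT, x_end_date TEXT,
        former_employer TEXT, new_employer TEXT, start_date TEXT
    );
    INSERT INTO yourtable VALUES
        (2, '2007-10-04', '2008-05-05', 'MICHAELS', 'KFC', '2017-01-01'),
        (2, '2008-05-06', '2009-05-05', 'GOOGLE',   'KFC', '2017-01-01'),
        (2, '2009-05-06', '2010-05-05', 'FACEBOOK', 'KFC', '2017-01-01');
""")

row = conn.execute("""
    SELECT profile_id, MAX(x_end_date), MAX(former_employer)
    FROM yourtable
    GROUP BY profile_id
""").fetchone()

# MAX(former_employer) is the alphabetically largest value, 'MICHAELS', which
# comes from a *different* row than the latest x_end_date ('2010-05-05',
# whose former_employer is actually 'FACEBOOK').
print(row)
```

So the per-column MAX answer returns a row that never existed in the table, which is exactly why the `KEEP (DENSE_RANK LAST ...)` variant (Oracle) or the ROW_NUMBER approach above is needed.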