
I have a table that acts like a ledger, where each row has a column containing either a positive or negative number (coins), as well as a secondary number (value) that I'll need to sum.

id      coins   value   user_id   created
UUID1      25       0   UUID42    datetime
UUID2     -25       0   UUID42    datetime
UUID3      50     599   UUID42    datetime
UUID4      25       0   UUID13    datetime
UUID5      25     399   UUID42    datetime
UUID6     100    1099   UUID42    datetime
UUID7      25     299   UUID42    datetime
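
If it helps to reproduce this, here's a minimal sketch of the schema and data; the column types are assumptions, since the real table uses UUIDs and timestamps:

-- Hypothetical setup matching the sample data above; types are assumptions
create table t (
  id      text primary key,   -- uuid in the real table
  coins   integer not null,   -- positive = earned, negative = spent
  value   integer not null,   -- secondary amount I need to sum later
  user_id text not null,      -- uuid in the real table
  created timestamp not null
);

insert into t (id, coins, value, user_id, created) values
  ('UUID1',  25,    0, 'UUID42', now()),
  ('UUID2', -25,    0, 'UUID42', now() + interval '1 minute'),
  ('UUID3',  50,  599, 'UUID42', now() + interval '2 minutes'),
  ('UUID4',  25,    0, 'UUID13', now() + interval '3 minutes'),
  ('UUID5',  25,  399, 'UUID42', now() + interval '4 minutes'),
  ('UUID6', 100, 1099, 'UUID42', now() + interval '5 minutes'),
  ('UUID7',  25,  299, 'UUID42', now() + interval '6 minutes');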

Using this table, say that user UUID42 is spending 100 coins. I need to get back records UUID3, UUID5, and UUID6, because their SUM(coins) >= 100, but not record UUID7, since the other three already fulfill the criteria. Once I have those records back, I'll be able to sum the value column and do other processing, but I need the individual records returned because I need to know their individual values; it's not a simple SUM(value).

Here's where I'm at currently (as provided by Stu, but slightly modified):

with rt as (
  -- running total of coins per user, in chronological order
  select *,
    sum(coins) over(partition by user_id order by created) running
  from t
), valid as (
  -- flag rows where the running total increased over the previous row
  select *,
    case when lag(running) over(partition by user_id order by created) < running then 1 else 0 end cons
  from rt
  where user_id = 'UUID42' and running - 100 < 100
)
select id, coins, value
from valid
where cons = 1

The issue I'm having with this query is that if the user spends 49 coins, only record UUID3 should be returned, because the 50 coins in that record cover the 49 needed, but I'm getting both UUID3 and UUID5 back. Then, if the user is only spending 1 coin, again only record UUID3 should be returned, but nothing is returned.
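
For reference, the running totals for UUID42 (ordered by created) work out to:

id      coins   running
UUID1      25        25
UUID2     -25         0
UUID3      50        50
UUID5      25        75
UUID6     100       175
UUID7      25       200

So a spend of 100 should select UUID3, UUID5, and UUID6, while a spend of 49 or of 1 should select only UUID3.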

I'm using Postgres, but if anyone knows how it can be done in another engine, I'm sure I can get it close.

EDIT: The answer @Stu gave is super close and answers my original question, but it turns out it only really works for that one case. I've updated my question to better clarify what I'm looking for.

jeremywoertink
  • Taking the minimum value that is `> 100` from a [cumulative sum](https://stackoverflow.com/questions/2120544/how-to-get-cumulative-sum) based on the value column would solve your issue – Modar Na Sep 08 '21 at 19:02

2 Answers


Here's a possible solution that should work in most databases, including Postgres. It uses a couple of consecutive CTEs: the first calculates the running total, the second flags which rows are consecutively required, and the final select takes the overall sum of the qualifying rows.

with rt as (
    -- running total of coins per user
    select *,
        sum(coins) over(partition by user_id order by id) running
    from t
), valid as (
    -- a row qualifies while it increases the total and the previous
    -- total has not yet covered the 100 being spent
    select *,
      case when lag(running) over(partition by user_id order by id)<running
        and lag(running) over(partition by user_id order by id)<=100
      then 1 else 0 end cons
    from rt
)
select sum(value)
from valid
where cons=1

DB<>Fiddle
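
To get the individual rows back, as the question asks, the final select can return the row columns instead of the aggregate, and the hard-coded 100 can be lifted into a parameter. Here's a sketch of that variation, using the question's created ordering and a hypothetical :spend placeholder for whatever parameter-binding style your client uses:

with rt as (
  -- running total of coins per user, in chronological order
  select *,
    sum(coins) over(partition by user_id order by created) running
  from t
), valid as (
  -- a row qualifies if it raises the running total and the previous
  -- total had not yet covered the amount being spent
  select *,
    case when lag(running) over(partition by user_id order by created) < running
      and lag(running) over(partition by user_id order by created) <= :spend
    then 1 else 0 end cons
  from rt
  where user_id = 'UUID42'
)
select id, coins, value  -- individual rows, so each value is available for later processing
from valid
where cons = 1

With :spend = 100 this should return UUID3, UUID5, and UUID6; with 49 or 1 it should return only UUID3.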

Stu
    Awesome! This works. I hadn't seen the `lag` function before. Neat. – jeremywoertink Sep 08 '21 at 19:26
  • Ok, turns out this doesn't quite work for all cases. If that user only spends 49 coins, it should only return record 3, but that query returns 3 and 5. If he spends 1 coin, it should return just record 3, but it returns nothing. – jeremywoertink Sep 08 '21 at 23:25
    @jeremywoertink I've revised it slightly by amending the case expression in the *valid* CTE, this might be a better solution. – Stu Sep 08 '21 at 23:49
  • Thanks @stu! That works a lot better. I appreciate the help. – jeremywoertink Sep 09 '21 at 15:06

Try this:

select sum(value) from ledger
where user_id = 42 and id <=
(
  with temp1 as (
    -- running total of coins for the user, in id order
    select t.id, t.value, sum(t.coins) over(order by t.id) as running
    from ledger t
    where t.user_id = 42
  ),
  temp2 as (
    -- first row whose running total covers the 100 being spent
    select t1.id, t1.value
    from temp1 t1
    where t1.running >= 100
    order by t1.id
    limit 1
  )
  select id from temp2
)


Mitko Keckaroski
  • Thanks for the suggestion, but I can't rely on the IDs. Technically they will be UUID in the real DB. I just used int to keep it smaller and more concise. – jeremywoertink Sep 08 '21 at 19:20
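
Since the real ids are UUIDs that don't sort in insertion order, one adaptation of this answer (an untested sketch, still hard-coding the 100 spend and keeping the answer's ledger table name) is to drive the cutoff off created instead:

select sum(value) from ledger
where user_id = 'UUID42' and created <=
(
  with temp1 as (
    -- running total in chronological order, since UUIDs don't sort that way
    select t.created, sum(t.coins) over(order by t.created) as running
    from ledger t
    where t.user_id = 'UUID42'
  )
  select created from temp1
  where running >= 100   -- first row whose total covers the spend
  order by created
  limit 1
)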