
Disclaimer: I have searched for similar questions but I haven't found a clear answer to my issue.

I have a table like this:

| id | catid | views | text         |
|----|-------|-------|--------------|
| 1  | 100   | 2000  | "sometext"   |
| 2  | 200   | 2000  | "sometext"   |
| 3  | 200   | 3000  | "longertext" |

For each catid (in this case there are just two: 100 and 200), I need to get the record with the most views, breaking ties by the longest text. In this case the result would be:

| id | catid | views | text         |
|----|-------|-------|--------------|
| 1  | 100   | 2000  | "sometext"   |
| 3  | 200   | 3000  | "longertext" |

Views take priority over text length. I have tried some queries with inner joins, but none of them was both clear and working.

Any thoughts?

  • @Strawberry: The answer below is much better than the answer found in your linked question! – giò Dec 26 '19 at 22:15
  • I'm sure they're essentially the same – Strawberry Dec 26 '19 at 22:19
  • Here's a nested `JOIN` solution to give the result you want: https://www.db-fiddle.com/f/rTwJKMNjoG5KUe8pSU36Su/0 – Nick Dec 26 '19 at 22:22
  • @Nick: that's much more complex than the simple GMB query. Regarding the efficiency: in this case future maintainability is preferred over efficiency, since this is a one-shot query – giò Dec 26 '19 at 22:25
  • @giò it looks more complex, but there are a total of 3 selects in that query, whereas for GMBs answer there is 1 select for *every* row in the table. – Nick Dec 26 '19 at 22:27

1 Answer


One method uses a correlated subquery with `order by` and `limit` for filtering. Assuming that `id` is a primary or unique key:

```sql
select t.*
from mytable t
where t.id = (
    select t1.id
    from mytable t1
    where t1.catid = t.catid
    order by t1.views desc, char_length(t1.text) desc
    limit 1
)
```
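To see the query in action on the sample data from the question, here is a minimal sketch using Python's built-in sqlite3 module and an in-memory database (the table and column names follow the answer; note that SQLite uses `length()` where MySQL uses `char_length()`):

```python
import sqlite3

# Load the sample rows from the question into an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE mytable (id INTEGER PRIMARY KEY, catid INTEGER, views INTEGER, text TEXT)"
)
conn.executemany(
    "INSERT INTO mytable VALUES (?, ?, ?, ?)",
    [
        (1, 100, 2000, "sometext"),
        (2, 200, 2000, "sometext"),
        (3, 200, 3000, "longertext"),
    ],
)

# The correlated subquery from the answer: for each outer row, keep it only
# if its id is the one that sorts first within its catid, ordering by views
# descending and then by text length descending.
rows = conn.execute("""
    SELECT t.*
    FROM mytable t
    WHERE t.id = (
        SELECT t1.id
        FROM mytable t1
        WHERE t1.catid = t.catid
        ORDER BY t1.views DESC, length(t1.text) DESC
        LIMIT 1
    )
""").fetchall()

print(rows)  # [(1, 100, 2000, 'sometext'), (3, 200, 3000, 'longertext')]
```

For catid 200 the tie-breaking matters: ids 2 and 3 both exist, id 3 wins on views alone here, but if the views were equal the longer text would win.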
    Spectacular query: simple and effective... I haven't found this query in the other related questions. I hope this answer will pop up earlier in the Google searches – giò Dec 26 '19 at 22:13
  • @giò because the subquery has to be executed for *every* row I think you'll find this is a *lot* less efficient than the queries in the linked question when your table gets large. – Nick Dec 26 '19 at 22:19