
First, I have to say I have walked through a lot of questions, but I can't find the right answer to mine. I see some questions from people asking the same thing. The top answers there state something like this:

DELETE FROM MyTable
LEFT OUTER JOIN (
    SELECT MIN(RowId) AS RowId, Col1, Col2, Col3
    FROM MyTable
    GROUP BY Col1, Col2, Col3
) AS KeepRows
    ON MyTable.RowId = KeepRows.RowId
WHERE KeepRows.RowId IS NULL

The problem with this is that you select a min or max ID, and everything above or below it that has duplicate values gets deleted.

Now, I have a database with about 50k rows, containing a couple of hundred duplicate rows. Is there any other way to delete the duplicate rows, except for one copy of each?

In the end I want to delete all duplicate rows from my database. Doing it by hand, one by one, is not an option, because we might be talking about a thousand rows. I need an SQL query (if possible) which selects all duplicates and deletes all of them except for one. (A duplicate can be a row with the same description, but a row with the same name is also seen as a duplicate.)

This statement selects all the duplicates from my database. Now I need something like it that deletes those rows.

SELECT
    a.id, a.word, a.description,
    b.id, b.word, b.description
FROM wordslist AS a
INNER JOIN wordslist AS b
    ON a.word = b.word AND a.id != b.id;
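The dedup pattern this question is after ("keep one row per word, delete the rest") can be sanity-checked outside the live database. Below is a minimal sketch using Python's built-in `sqlite3` against an in-memory copy with an assumed three-column schema (`id`, `word`, `description`); the table contents are made-up sample data, not from the real database:

```python
import sqlite3

# In-memory stand-in for the real wordslist table (assumed schema).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE wordslist (id INTEGER PRIMARY KEY, word TEXT, description TEXT)"
)
conn.executemany(
    "INSERT INTO wordslist (id, word, description) VALUES (?, ?, ?)",
    [
        (1, "apple", "a fruit"),
        (2, "apple", "a fruit"),        # duplicate word
        (3, "banana", "a fruit"),
        (4, "banana", "also a fruit"),  # duplicate word, different description
    ],
)

# Delete every row whose id is not the minimum id for its word,
# so exactly one row per word survives.
conn.execute(
    "DELETE FROM wordslist "
    "WHERE id NOT IN (SELECT MIN(id) FROM wordslist GROUP BY word)"
)
conn.commit()

remaining = conn.execute("SELECT id, word FROM wordslist ORDER BY id").fetchall()
print(remaining)  # [(1, 'apple'), (3, 'banana')]
```

The same `NOT IN (SELECT MIN(id) ... GROUP BY word)` shape should carry over to the real database, though the exact `DELETE` syntax for joining/subquerying varies by engine.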
Jorn Barkhof
  • Copy rows to a new table where the key will reject those duplicate rows. – nicomp Mar 14 '19 at 09:13
  • I don't see why what you state is a problem actually is a problem. Isn't it just what you want to do? Also, can you be more precise? You look for another solution, but why? What's your criteria for wanting something else? – Kaddath Mar 14 '19 at 09:16
    `DELETE FROM wordslist WHERE id in(SELECT b.id FROM wordslist a INNER JOIN wordslist b ON a.word = b.word AND b.id > a.id)` or `DELETE FROM wordslist WHERE id NOT IN(SELECT min(id) FROM wordslist GROUP BY word)` .. run with caution (on a backup table or in a transaction) – Caius Jard Mar 14 '19 at 09:26
  • Thanks alot @CaiusJard This has solved my case for now – Jorn Barkhof Mar 14 '19 at 09:45
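As the comment above suggests, either `DELETE` variant is worth trying inside a transaction first so it can be rolled back if the row count looks wrong. A minimal sketch of that workflow with the self-join variant, again using `sqlite3` and an assumed `id`/`word`/`description` schema with made-up sample rows:

```python
import sqlite3

# In-memory stand-in for the real table (assumed schema, sample data).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE wordslist (id INTEGER PRIMARY KEY, word TEXT, description TEXT)"
)
conn.executemany(
    "INSERT INTO wordslist VALUES (?, ?, ?)",
    [(1, "apple", "x"), (2, "apple", "x"), (3, "pear", "y")],
)
conn.commit()

cur = conn.cursor()
cur.execute("BEGIN")
# Self-join delete: for each pair of rows sharing a word, drop the one
# with the higher id, keeping the lowest id per word.
cur.execute(
    "DELETE FROM wordslist WHERE id IN ("
    "  SELECT b.id FROM wordslist a"
    "  INNER JOIN wordslist b ON a.word = b.word AND b.id > a.id)"
)
deleted = cur.rowcount  # number of duplicate rows removed

# Commit only if the damage matches expectations; otherwise undo.
if deleted == 1:
    conn.commit()
else:
    conn.rollback()

print(conn.execute("SELECT COUNT(*) FROM wordslist").fetchone()[0])  # 2
```

The expected-count check is a stand-in for whatever verification makes sense on the real data (e.g. comparing against the duplicate count from the `SELECT` in the question).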
