
I created a MySQL table that has been filling up over the last couple of weeks. Now I have found out that some rows contain duplicates.

The fields deviceId, counter1, and counter2 together should form a unique index. However, I cannot ALTER the table to add it while the duplicates exist. So my question is: how do I remove these duplicates using a MySQL query?

Entries are duplicated on counter1 and/or counter2:

deviceId      ts                      counter1    counter2
device001     2018-06-11 10:18:51     1           2
device001     2018-06-11 10:23:10     1           2
device002     2018-06-12 08:27:33     321         20
device002     2018-06-12 09:49:13     321         20

From each pair of duplicates, one row should be removed. Any suggestions on how to do this simply? I have tried the following:

Created a copy of the table:

CREATE TABLE sp_data_copy LIKE sp_data_orig;
INSERT INTO sp_data_copy SELECT * FROM sp_data_orig;

Then selected and deleted duplicates from the copy:

DELETE FROM sp_data_copy
WHERE ts IN (
    SELECT ts 
    FROM sp_data_orig
    GROUP BY counter1, counter2
    HAVING COUNT(*) > 1);

But obviously this has not worked for me. I hope you have some suggestions. Thanks in advance.
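
Edit: one approach I am considering, based on the common INSERT IGNORE pattern (a sketch, untested against my data; table and key names other than sp_data_orig are my own placeholders):

```sql
-- Create an empty copy with the intended unique key already in place
CREATE TABLE sp_data_dedup LIKE sp_data_orig;
ALTER TABLE sp_data_dedup
    ADD UNIQUE KEY uq_dev_counters (deviceId, counter1, counter2);

-- INSERT IGNORE silently skips rows that violate the unique key,
-- so only the first row of each duplicate group is kept
INSERT IGNORE INTO sp_data_dedup
SELECT * FROM sp_data_orig;

-- After verifying row counts, swap the tables
RENAME TABLE sp_data_orig TO sp_data_old,
             sp_data_dedup TO sp_data_orig;
```

This way the unique index exists from the start, so the duplicates never make it into the new table, and the original data stays untouched until the rename. Would this be a sensible way to do it?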

Jeroen
  • Is that all the columns? Is there no primary key, or other column that is unique? – Willem Renzema Jun 16 '18 at 17:46
  • Please do `SHOW CREATE TABLE tablename;` on the table in question. I think you're saying the three columns *should* be a unique index, but you can't alter the table to make that true. Is that right? – O. Jones Jun 16 '18 at 17:48
  • Yes, that is correct. I have NO primary key at this moment and would like to add a unique index to error out duplicates. – Jeroen Jun 16 '18 at 18:09
  • 'CREATE TABLE `sp_data` ( `devId` varchar(20) NOT NULL, `ts` datetime NOT NULL, `counter 1` int(11) DEFAULT NULL, `counter2` tinyint(4) DEFAULT NULL, `status` tinyint(4) DEFAULT NULL, `batteryLevel` int(11) DEFAULT NULL ) ENGINE=InnoDB DEFAULT CHARSET=utf8' – Jeroen Jun 16 '18 at 18:13
  • I read through the answer you referred to – however, it does not solve my duplicate issue. I have edited and updated my question above. – Jeroen Jun 17 '18 at 13:46
