
I want to delete duplicate rows from my table on the basis of CategoryID, but I don't want to delete all of them; I want to leave one row when there is more than one row with the same CategoryID.

This is the query I have made; I need to change it:

DELETE FROM twinhead_tblcategory
WHERE categoryid IN (SELECT categoryid
                     FROM twinhead_tblcategory
                     GROUP BY categoryid
                     HAVING COUNT(categoryid) > 1)
NoviceToDotNet

7 Answers


Do a SELECT DISTINCT into a new table, delete the old one, and rename the new one to the old table name.
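A minimal sketch of that approach for the asker's table, assuming SQL Server and assuming the duplicate rows are identical in every column (not just CategoryID):

```sql
-- Copy the de-duplicated rows into a new table.
SELECT DISTINCT *
INTO   twinhead_tblcategory_new
FROM   twinhead_tblcategory;

-- Drop the old table and give the new one its name.
DROP TABLE twinhead_tblcategory;
EXEC sp_rename 'twinhead_tblcategory_new', 'twinhead_tblcategory';
```

Note that DISTINCT only collapses rows that match in every column; if only CategoryID repeats while other columns differ, you would need GROUP BY instead. Also, SELECT ... INTO does not carry over indexes, constraints, or permissions from the old table.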

dance2die

For SQL Server you can do it like this:

WITH MyTableCTE (CategoryId, RowNumber)
AS
(
    SELECT CategoryId,
           ROW_NUMBER() OVER (PARTITION BY CategoryId ORDER BY CategoryId) AS 'RowNumber'
    FROM MyTable
)
DELETE FROM MyTableCTE WHERE RowNumber > 1

Note the PARTITION BY: it restarts the numbering for each CategoryId, so only the extra rows within each group get RowNumber > 1. Without it, the numbering would run over the whole table and everything except one single row would be deleted.
RobertoBr
  • Please explain this WITH MyTableCTE (CategoryId, RowNumber); how do the two passed column names help here? – NoviceToDotNet Feb 24 '11 at 14:57
  • It is called a CTE (common table expression). Have a look at http://msdn.microsoft.com/en-us/library/ms190766.aspx, where you can find much more info! – RobertoBr Feb 24 '11 at 16:30
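To illustrate the point in the comment above: the list in parentheses after the CTE name simply declares names for the CTE's output columns, which the outer statement then refers to. A minimal sketch against the asker's table (column names assumed):

```sql
WITH Numbered (CategoryId, RowNumber) AS
(
    -- Number the rows within each CategoryId group.
    SELECT categoryid,
           ROW_NUMBER() OVER (PARTITION BY categoryid ORDER BY categoryid)
    FROM   twinhead_tblcategory
)
-- The outer statement uses the column names declared after the CTE name.
SELECT CategoryId, RowNumber
FROM   Numbered;
```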
3

If your rows have a distinct id column, then this should work:

DELETE t1 FROM your_table t1, your_table t2 
WHERE t1.column1 = t2.column1 AND t1.column2 = t2.column2
AND ... /* check equality of all relevant columns */
AND t1.id < t2.id
Alessandro

Check here for SQL Server - http://support.microsoft.com/kb/139444 - that should get you started.

czuroski

This is probably heavy-handed, but perhaps you could SELECT DISTINCT * into a temp table, then truncate the original table, then insert the contents of the temp table back into it. Foreign key constraints may prevent this, though.
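Sketched out for the asker's table (a sketch under two assumptions: the duplicates are identical in every column, and the table is not referenced by a foreign key, since TRUNCATE fails in that case):

```sql
-- Stash one copy of each row in a temp table.
SELECT DISTINCT *
INTO   #tmp
FROM   twinhead_tblcategory;

-- Empty the original table and put the unique rows back.
TRUNCATE TABLE twinhead_tblcategory;

INSERT INTO twinhead_tblcategory
SELECT * FROM #tmp;

DROP TABLE #tmp;
```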

SquidScareMe

For SQL Server, you could use a cursor to loop through all items, ordered by that CategoryID.

Is the current ID the same as the previous one? Then delete it; see example C of this article. Otherwise, remember the ID for the next round.
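A sketch of that cursor loop, assuming the table has some unique key column (called Id here purely for illustration; the question does not show the full schema):

```sql
DECLARE @prevCat INT = NULL, @cat INT, @id INT;

DECLARE dup_cursor CURSOR FOR
    SELECT Id, categoryid
    FROM   twinhead_tblcategory
    ORDER BY categoryid, Id;

OPEN dup_cursor;
FETCH NEXT FROM dup_cursor INTO @id, @cat;

WHILE @@FETCH_STATUS = 0
BEGIN
    IF @cat = @prevCat
        -- Same CategoryID as the previous row: it is a duplicate, delete it.
        DELETE FROM twinhead_tblcategory WHERE Id = @id;
    ELSE
        -- First row of a new CategoryID: keep it and remember the ID.
        SET @prevCat = @cat;

    FETCH NEXT FROM dup_cursor INTO @id, @cat;
END

CLOSE dup_cursor;
DEALLOCATE dup_cursor;
```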

Hans Kesting

You have several ways to delete duplicate rows.

For my solutions, first consider this table as an example:

CREATE TABLE #Employee
(
    ID          INT,
    FIRST_NAME  NVARCHAR(100),
    LAST_NAME   NVARCHAR(300)
)

INSERT INTO #Employee VALUES ( 1, 'Vahid', 'Nasiri' );
INSERT INTO #Employee VALUES ( 2, 'name1', 'lname1' );
INSERT INTO #Employee VALUES ( 3, 'name2', 'lname2' );
INSERT INTO #Employee VALUES ( 2, 'name1', 'lname1' );
INSERT INTO #Employee VALUES ( 3, 'name2', 'lname2' );
INSERT INTO #Employee VALUES ( 4, 'name3', 'lname3' );

First solution: use another table to hold the de-duplicated rows.

SELECT DISTINCT *
FROM   #Employee 

SELECT * INTO #DuplicateEmployee
FROM   #Employee
WHERE  1 = 0 -- copy only the table structure, no rows

INSERT #DuplicateEmployee
SELECT DISTINCT *
FROM   #Employee

BEGIN TRAN 
DELETE #Employee
INSERT #Employee
SELECT *
FROM   #DuplicateEmployee

COMMIT TRAN 

DROP TABLE #DuplicateEmployee 

SELECT DISTINCT *
FROM   #Employee 

Second solution: use GROUP BY ... HAVING to find the duplicated rows.

SELECT DISTINCT * FROM #Employee

SELECT * INTO #DuplicateEmployee FROM #Employee WHERE 1 = 0 -- structure only, no rows

INSERT #DuplicateEmployee
SELECT ID,
    FIRST_NAME,
    LAST_NAME
FROM   #Employee
GROUP BY
    ID,FIRST_NAME,LAST_NAME
HAVING COUNT(*) > 1

BEGIN TRAN
DELETE #Employee
FROM   #DuplicateEmployee
WHERE  #Employee.ID = #DuplicateEmployee.ID
AND #Employee.FIRST_NAME = #DuplicateEmployee.FIRST_NAME
AND #Employee.LAST_NAME = #DuplicateEmployee.LAST_NAME

INSERT #Employee
SELECT *
FROM   #DuplicateEmployee

COMMIT TRAN
DROP TABLE #DuplicateEmployee

SELECT DISTINCT * FROM   #Employee

Third solution: use ROWCOUNT.

SELECT DISTINCT *
FROM   #Employee

SET ROWCOUNT 1
SELECT 1
WHILE @@rowcount > 0
   DELETE #Employee
   WHERE  1 < (
          SELECT COUNT(*)
          FROM   #Employee a2
          WHERE  #Employee.ID = a2.ID
                 AND #Employee.FIRST_NAME = a2.FIRST_NAME
                 AND #Employee.LAST_NAME = a2.LAST_NAME
      )

SET ROWCOUNT 0

SELECT DISTINCT *
FROM   #Employee

Fourth solution: use analytic (window) functions.

SELECT DISTINCT *
FROM   #Employee;

WITH DeleteEmployee AS (
    SELECT ROW_NUMBER() OVER (PARTITION BY ID, First_Name, Last_Name
                              ORDER BY ID) AS RNUM
    FROM   #Employee
)
DELETE
FROM   DeleteEmployee
WHERE  RNUM > 1

SELECT DISTINCT *
FROM   #Employee

Fifth solution: use an identity field.

SELECT DISTINCT *
FROM   #Employee;

ALTER TABLE #Employee ADD UNIQ_ID INT IDENTITY(1, 1)

DELETE
FROM   #Employee
WHERE  UNIQ_ID < (
    SELECT MAX(UNIQ_ID)
    FROM   #Employee a2
    WHERE  #Employee.ID = a2.ID
           AND #Employee.FIRST_NAME = a2.FIRST_NAME
           AND #Employee.LAST_NAME = a2.LAST_NAME
)

ALTER TABLE #Employee DROP COLUMN UNIQ_ID

SELECT DISTINCT *
FROM   #Employee

And at the end of all the solutions, run this command:

DROP TABLE #Employee

The source of my answer is this site.

Ardalan Shahgholi
  • Hello, please don't post duplicate answers across the site as this creates noise. Instead, if you find duplicate posts, flag them as duplicates. You only need 15 reputation to flag posts. See [duplicate answers](http://meta.stackexchange.com/questions/178223/why-were-all-my-copy-pasted-answers-to-duplicate-questions-deleted) for details. – jamesmortensen Nov 07 '13 at 02:19
  • Hello @loup, your answers were automatically flagged by the system and deleted by a moderator. The system is built to look for suspicious or problematic activity and bring that to the community's attention. If you edit your post to fix the link to change it from "this site" to actually say the name of the site so there's proper attribution, I can remove that downvote. Good luck! – jamesmortensen Nov 07 '13 at 06:35
  • If you find a duplicate, use the "flag" link under the question to mark it as a duplicate of another question. Please see the [flag posts privilege](http://stackoverflow.com/help/privileges/flag-posts) page for more details. Good luck! – jamesmortensen Nov 07 '13 at 06:47