1

I want to delete some data from my table, so I wrote a query to delete it. First I selected the data I wanted to delete using this query, and it shows me the correct data:

select * from PATS.Discipline where Code like '%DHA-DIS%'

Then I wrote the DELETE using the same WHERE condition, but it is not working:

DELETE FROM PATS.Discipline WHERE Code LIKE '%DHA-DIS%'

Here I'm sharing some screenshots taken while executing the query (screenshots omitted).

Here is some sample data (screenshot omitted).

This is the last execution result; I waited for a minute and then stopped the execution (screenshot omitted).

Added the table structure (screenshot omitted).

Updated on 15-OCT-2019

I tried the same scenario today: I imported 5,000 records into the table and tried to delete those 5,000 records using the same query, and surprisingly it is working. I tried both cases:

  1. Data with no foreign-key references: it works fine. Here is the screenshot (screenshot omitted).

  2. Data that is referenced by a foreign key: it shows an error (screenshot omitted).

Both cases are working now; I don't know what happened to SQL Server that day.

Dhanil Dinesan

11 Answers

3

It should be working fine; I don't see anything wrong with the query. Try writing a fresh query in a new query window to make sure the query is clean.

For me it works fine and removes the test data from the table:

select * from Discipline
select * from Discipline where Code like '%DHA-DIS%'


DELETE FROM Discipline WHERE Code LIKE '%DHA-DIS%'

(2 row(s) affected)


select * from Discipline where Code like '%DHA-DIS%'


You can see the above query works fine; it deleted the 2 rows that matched the query.

Learning
2

It might be that there is a lock on the table Discipline held by another process.
You can find the locks on your table by running this script:

-- Capture the output of sp_lock and sp_who into table variables
declare @lock table (spid int, dbid int, objId int, indId int, Type char(4), resource nchar(32), Mode char(8), status char(6))
declare @who table (spid int, ecid int, status char(30), loginname char(128), hostname char(128), blk char(5), dbname char(128), cmd char(16), request_id INT)
declare @LockSummary table (loginname varchar(128), DB varchar(128), object varchar(128), ToLevel varchar(20), How_Many int, Xclusive_lock_for_command char(16), spid int, hostname char(128))

insert into @lock exec sp_lock
insert into @who exec sp_who

-- Summarize granted locks per login, object and session
insert into @LockSummary
select loginname, 
       db_name(dbid) as DB,
       object_name(objID) as object,
       max(mode) as [ToLevel],
       Count(*) as [How Many],
       Max(Case When mode = 'X' Then cmd Else null End) as [Xclusive lock for command],
       l.spid, 
       hostname
from   @lock l 
  join @who w on l.spid = w.spid
where  dbID != db_id('tempdb') 
and    l.status = 'GRANT'
group by dbID, objID, l.spid, hostname, loginname

-- Show only the locks held on the Discipline table
select * 
from   @LockSummary 
where  object like '%Discipline%'
order by [ToLevel] Desc, [How_Many] Desc, loginname, DB, object
GuidoG
2

I think you are more concerned about the execution time here; it was taking a long time, so you cancelled the execution. I faced the same issue in the past; my solution was to restart SQL Server Management Studio and run the same query again, and it worked fine.
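
Before restarting, it may be worth checking whether the session is simply blocked by another one. A minimal sketch using the sys.dm_exec_requests DMV, run in a separate query window while the DELETE appears to hang:

-- Sessions that are currently blocked, and the session blocking them
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       r.command
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0;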

Amira Bedhiafi
ganesh pandey
0

1) You have 5000 rows to delete, but that is not necessarily the entire table. How many rows are in the table?

2) Your LIKE predicate with a leading wildcard will require a table scan to satisfy. If there are loads of rows or large columns, that will take a long time.

3) Post the query plan and the stats on the tables it uses (a sketch for capturing these follows this list).

4) If you cancel a delete then the DB will roll back. That is why you are seeing it do nothing.
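
A minimal sketch for gathering that information, assuming the table and filter from the question; run the filtered SELECT with the actual execution plan enabled in SSMS to see whether it scans:

-- Total row count versus rows matching the filter
SELECT COUNT(*) AS TotalRows FROM PATS.Discipline;
SELECT COUNT(*) AS MatchingRows FROM PATS.Discipline WHERE Code LIKE '%DHA-DIS%';

-- I/O and timing statistics for the filtered read
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
SELECT * FROM PATS.Discipline WHERE Code LIKE '%DHA-DIS%';
SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;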

LoztInSpace
  • even with a table scan it should not take that long because there are only 5000 rows in the table. My guess is that there is a lock on the table by another process – GuidoG Jul 09 '19 at 07:27
0

Have you tried the same query in edit mode? You can do this:

  1. Right-click the PATS.Discipline table.
  2. Choose Edit Top 200 Rows.
  3. Press Ctrl + 3 to edit the query.
  4. Replace the query with select * from Discipline where Code like '%DHA-DIS%'
  5. Click the top-left cell, as in the screenshot below, to select all rows.
  6. Hit the Delete button.

If the execution is taking too long, try changing your query to something like this:

select top 1000 * from Discipline where Code like '%DHA-DIS%'

Repeat the step if this succeeds.

(screenshot omitted)
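
The same idea can also be expressed directly in T-SQL by deleting in small batches until nothing matches; a minimal sketch, assuming the table and filter from the question:

-- Delete matching rows in batches of 1,000 until none are left
WHILE 1 = 1
BEGIN
    DELETE TOP (1000)
    FROM PATS.Discipline
    WHERE Code LIKE '%DHA-DIS%';

    IF @@ROWCOUNT = 0
        BREAK;
END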

Gnyasha
0

Two things could be slowing down your delete:

  1. You have cascade delete on the foreign keys referencing the PK in your PATS.Discipline table. You can check whether any keys have cascade delete turned on by running the query below.
    select * from sys.foreign_keys where delete_referential_action > 0
  2. The primary key of your PATS.Discipline table is used as a foreign key in other tables with a very large number of rows. Deleting records from the Discipline table will trigger a referential integrity check on every table where this key is referenced, and depending on several factors this can be a lengthy process. If you know that referential integrity is not violated by this deletion, you can drop the foreign keys, run your delete command, and then re-create the foreign keys (a sketch follows below).
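
A minimal sketch of that drop/re-create approach; the child table, column, and constraint names below are hypothetical placeholders, not the actual schema:

-- Hypothetical child table and constraint, for illustration only
ALTER TABLE PATS.SomeChildTable
    DROP CONSTRAINT FK_SomeChildTable_Discipline;

DELETE FROM PATS.Discipline WHERE Code LIKE '%DHA-DIS%';

-- Re-create the constraint; WITH CHECK re-validates the existing rows
ALTER TABLE PATS.SomeChildTable WITH CHECK
    ADD CONSTRAINT FK_SomeChildTable_Discipline
    FOREIGN KEY (DisciplineId) REFERENCES PATS.Discipline (Id);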
Dmitri M
0

Note: take a backup first.

If possible, alter PATS.Discipline to this:

ID int PK
Code varchar(20)
Name varchar(50)
CreatedBY varchar(50) or even better INT
CreatedDate Datetime2(0)
UpdatedBY varchar(50) or even better INT
UpdatedDate Datetime2(0)

Or alter whatever columns you can.

CreatedBY can be converted to INT by first manually joining with the relevant table and then correcting the code.

I am not going to discuss here that Code LIKE '%DHA-DIS%' is non-SARGable, why you should avoid it, or whether the Code column should have a non-clustered index or not.

Altering the table design will help you in every aspect, whether SELECT queries or DML operations, now and in the future.

Alternatively, you can do this:

Put the SELECT result set in a temp table, then join and delete:

create table #temp (id uniqueidentifier not null primary key)

insert into #temp 
select id from PATS.Discipline where Code like '%DHA-DIS%'

delete from PATS.Discipline
where exists(select 1 from #temp where #temp.id=PATS.Discipline.id )

You can take both steps.

If the data to delete runs into the millions, you can delete it in batches:

DECLARE @BatchSize INT = 10000
DECLARE @RowCount INT = 1

BEGIN TRY
    WHILE (@RowCount > 0)
    BEGIN
        -- delete one batch of matching rows
        DELETE TOP (@BatchSize) FROM PATS.Discipline
        WHERE EXISTS (SELECT 1 FROM #temp WHERE #temp.id = PATS.Discipline.id)

        -- stop once no more rows are affected
        SET @RowCount = @@ROWCOUNT
    END
END TRY
BEGIN CATCH
    -- handle or log the error here
END CATCH
KumarHarsh
0

This answer is based on my personal experience.

You are storing a uniqueidentifier as the Id of your table and have made it the clustered primary key. Deleting that many rows takes time due to the re-indexing of your table. So, I would suggest:

  1. Drop your primary key index
  2. Execute your delete statement
  3. Re-add your primary key index

It helped me in a similar case. Why? Because with this strategy, the indexing of the table is manipulated only twice, and no re-indexing occurs while the rows are being deleted. For that reason, you will see a significant improvement in deletion performance.

So, your query should be like this:

ALTER TABLE PATS.Discipline DROP CONSTRAINT [PK_Discipline]
GO
DELETE FROM PATS.Discipline WHERE Code LIKE '%DHA-DIS%'
GO
-- re-create the clustered primary key on the Id column
ALTER TABLE PATS.Discipline ADD CONSTRAINT [PK_Discipline] PRIMARY KEY CLUSTERED (Id)
GO

Note: This deletion strategy should only be applied to cases like this, where the table has numerous indexes or the primary key is a uniqueidentifier and you are deleting a large number of rows. In the normal scenario, the plain old DELETE query should be used.

Adnan Sharif
0

It is probably not a problem with the LIKE keyword. The SELECT appears to return a lot of matching rows - I presume relatively quickly(?) - so the question is "What is slow about (or what prevents) the DELETE?" Try

DELETE TOP (1) FROM PATS.Discipline WHERE Code LIKE '%DHA-DIS%'

and if this works, try TOP (10), TOP (100), and so on.

An educated guess says that it's slow because of reindexing and/or cross-table relationships (foreign key constraints) and/or just building up a very large transaction.

How to speed it up? There are a lot of good suggestions amongst the answers already posted and I would also suggest having a look at this: How to efficiently delete rows while NOT using Truncate Table in a 500,000+ rows table

AlanK
  • it works like this; the issue is that it's not deleting the bulk data – Dhanil Dinesan Oct 14 '19 at 17:02
  • Did you try deleting top 100? If it works then the discussion in the link I posted explains why, to delete the bulk, you should do it in smaller "chunks". There are examples of how to do so neatly. The DELETE is executed transactionally, i.e. the DB writes its "intentions" (everything it is about to do) in the transaction log file (so that it can undo if something goes wrong), before it commits (makes the result final). In the absence of other evidence (for example of locking), this is probably why your DELETE runs for a very long time. Google "cost of sql delete" for more. – AlanK Oct 14 '19 at 17:26
  • You're welcome. If your question has been answered, please mark the best one. If not, but you have found the solution yourself, please post it. – AlanK Oct 15 '19 at 06:29
0

Overall, I don't see any issue with the code you were originally using, although using a LIKE predicate in a DELETE query will have performance impacts. If I know I am only deleting a few thousand rows, I take the CTE approach and join it back onto the source table:

;With DeleteCTE as 
(
    select 
        [ID],
        Code
    from Discipline 
    where Code like '%DHA-DIS%'
)
Delete D
from Discipline D 
Inner Join DeleteCTE DC 
    on DC.ID = D.ID
Tom
0

There is no problem with the query you used.

There may be several reasons: the network cable might have been disconnected, the database might have gone offline in the middle of the execution, there might have been a power failure, or access to the table might have been locked by another transaction, and so on.

When your query failed, it sometimes took more than a minute, right? But when it succeeded, the execution was very fast. There was probably another process accessing that table over the network at the time the query failed.

When dealing with a larger number of rows, like 5,000, you can use a stored procedure with explicit transaction blocks added.
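
For example, a minimal sketch of such a procedure, assuming the table and filter from the question; the procedure name and parameter are hypothetical:

CREATE PROCEDURE PATS.usp_DeleteDisciplineByCode
    @CodePattern NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRY
        BEGIN TRANSACTION;

        DELETE FROM PATS.Discipline
        WHERE Code LIKE @CodePattern;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        -- undo the partial delete and re-raise the original error
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        THROW;
    END CATCH
END

It could then be called with, for example, EXEC PATS.usp_DeleteDisciplineByCode @CodePattern = '%DHA-DIS%'.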