I have a unit test which enters a user into the system through the UI, but before doing so it removes any existing entry for that user

I have 3 delete queries, each matching only 1 record, but in my unit test the delete queries fail with a timeout error

I don't know how to optimise these queries; can someone help?

delete from CustomerRoles where RegisteredCustomerId = (select Id from RegisteredCustomers where Email = 'boltestsignupseller@yahoo.com')

delete from SellerInfos where RegisteredCustomerId = (select Id from RegisteredCustomers where Email = 'boltestsignupseller@yahoo.com')

DELETE FROM RegisteredCustomers where Email = 'boltestsignupseller@yahoo.com'

The second and third queries take more than 2 minutes and eventually time out.

1 Answer

Without knowledge of the database, this is impossible to diagnose definitively, but common causes would include:

  • a missing index on the column being used to filter (or an unusable index - perhaps due to varchar vs nvarchar, etc)
  • blocking due to conflicting operations
  • the existence of triggers performing an unbounded amount of additional hidden work
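If the first cause applies, the fix is to index the filter columns. The index names below are hypothetical, and you should confirm the indexes don't already exist (e.g. via `sp_helpindex`) before creating them:

```sql
-- Assumed missing; check with: EXEC sp_helpindex 'SellerInfos'; etc.
CREATE INDEX IX_CustomerRoles_RegisteredCustomerId
    ON CustomerRoles (RegisteredCustomerId);

CREATE INDEX IX_SellerInfos_RegisteredCustomerId
    ON SellerInfos (RegisteredCustomerId);

-- Also helps the subquery that looks up the customer by email:
CREATE INDEX IX_RegisteredCustomers_Email
    ON RegisteredCustomers (Email);
```

Note that if `Email` is `varchar` but the literal (or parameter) is `nvarchar`, the implicit conversion can make even an existing index unusable, which is the "varchar vs nvarchar" point above.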

Since the queries appear to be expecting a single RegisteredCustomers record, you can possibly reduce some overhead by capturing the located Id into a local variable at the start, and using that local in all three deletes, but this isn't a magic wand:

declare @id int = (
    select Id from RegisteredCustomers where Email = 'boltestsignupseller@yahoo.com');
delete from CustomerRoles where RegisteredCustomerId = @id;
delete from SellerInfos where RegisteredCustomerId = @id;
delete from RegisteredCustomers where Id = @id;

Most likely, though, you'll need to actually investigate what is happening (look at blocks, look at the query plan, look at the IO stats, look at the indexing etc).
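As a sketch of what that investigation looks like in SSMS (all standard SQL Server tooling; the table name is taken from the question):

```sql
-- From a second session, look for blocking while the delete is running
-- (a non-empty BlkBy column points at the blocking session):
EXEC sp_who2;

-- In the problem session, enable IO statistics before re-running the delete,
-- then read the "logical reads" counts in the Messages tab -
-- big numbers on a single-row delete suggest scans instead of seeks:
SET STATISTICS IO ON;

DELETE FROM SellerInfos
WHERE RegisteredCustomerId = 42; -- illustrative id

-- For the query plan: SSMS menu Query -> Include Actual Execution Plan,
-- re-run the statement, and look for table/index scans where you expected seeks.
```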

If there are lots of foreign keys on the tables, and those foreign keys are poorly indexed, deletes can take non-trivial amounts of time simply because the engine has to do a lot of work to verify that the deletes don't violate referential integrity. In some cases it is preferable to perform a logical delete rather than a physical delete, to avoid this work - i.e. have a column that signifies deletion, and do an `update ... set DeletionDate = GETUTCDATE() ... where ...` rather than a delete (but: you then need to remember to filter by this column in your queries).
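A minimal sketch of that logical-delete approach (the `DeletionDate` column is an assumption, not part of the original schema):

```sql
-- One-time schema change:
ALTER TABLE RegisteredCustomers ADD DeletionDate datetime2 NULL;

-- "Delete" becomes a cheap single-row update that touches no foreign keys:
UPDATE RegisteredCustomers
SET DeletionDate = GETUTCDATE()
WHERE Email = 'boltestsignupseller@yahoo.com';

-- Every existing query must now exclude logically deleted rows:
SELECT Id
FROM RegisteredCustomers
WHERE Email = 'boltestsignupseller@yahoo.com'
  AND DeletionDate IS NULL;
```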

Marc Gravell
  • Most likely, though, you'll need to actually investigate what is happening (look at blocks, look at the query plan, look at the IO stats, look at the indexing etc). - Okay, but the confusing part for me is that I'm not aware how to check all this, as I have not explored SQL much –  Aug 06 '20 at 10:57
  • @PS_1 then you've got a great learning opportunity! For blocking: `sp_who`/`sp_who2` from another session is a good place to start; for the query plan: in SSMS, Query -> Include Actual Execution Plan; for the stats: `set statistics io on;` - now re-run the query and see what you get (and remember to go run `sp_who`/`sp_who2` from another session to see if you catch any blockers). On the stats: "big numbers is bad" (but "big" is subjective) – Marc Gravell Aug 06 '20 at 11:00
  • In "sellerinfo" table total 11 field have entries in colaation - looked in to the structure of table –  Aug 06 '20 at 11:07
  • @PS_1 that doesn't tell you much, though; the stats output is going to be the most important thing in identifying the cause of the problem – Marc Gravell Aug 06 '20 at 11:56