
I got a task to delete 9 million records from a table.

So I'm planning to do it via a SQL Agent job, and I have prepared a script that deletes 50,000 records per DELETE statement.

Now, should I create a single job with many steps, or can I create several new jobs without schedules?

Which will be more resource-efficient?

Note: log backups are configured to run every 20 minutes.

Thanks, Vishvas.K

    Possible duplicate of [Deleting 1 millions rows in SQL Server](https://stackoverflow.com/questions/24785439/deleting-1-millions-rows-in-sql-server) – Dave C Dec 21 '18 at 16:24
  • There are plenty of answers on deleting large volumes. Any statements that hit > 5K rows will cause a table lock, so if you schedule multiple jobs to purge 50K each, you'll just wind up locking up your table. Batched deletion is the best approach. – Dave C Dec 21 '18 at 16:26

1 Answer


How about the quick and easy solution below:

  WHILE 1 = 1
  BEGIN
     -- Delete in batches of 5,000 rows to stay below the lock-escalation
     -- threshold and keep each transaction (and log usage) small.
     DELETE TOP (5000)
     FROM Table_Name
     WHERE <condition_here>;

     -- Stop once the last batch deleted fewer than 5,000 rows.
     IF @@ROWCOUNT < 5000 BREAK;
  END
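Since log backups run every 20 minutes, you may also want a short pause between batches so the log has a chance to be backed up and truncated during a long purge. A sketch of that variant, assuming the same `Table_Name` and placeholder `<condition_here>` filter as above (the 5-second delay is an arbitrary example value to tune for your environment):

```sql
-- Batched delete with a pause between batches to reduce blocking
-- and give the periodic log backups room to truncate the log.
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (5000)
    FROM Table_Name
    WHERE <condition_here>;   -- replace with your actual filter

    SET @rows = @@ROWCOUNT;

    WAITFOR DELAY '00:00:05'; -- tune the pause for your workload
END
```

With either loop, a single SQL Agent job with one step running the loop is simpler and no less efficient than many jobs each deleting one batch.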
Andrey Nikolov
Gaurav