You can create an additional table with a column for your target IDs, and perhaps a job ID as well. Then insert your IDs into this table, ending up with around 30k rows: one for each ID to delete. If the IDs you have are not the table's primary key, this is also the time to make that translation. This part of the operation can afford to be a little slow, because it won't contend for locks with other parts of your database, or you can use a bulk insert technique, which you already seem comfortable with.
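As a minimal sketch, the staging table might look like this. All names here (`DeleteQueue`, `TargetId`, `JobId`, `BigTable`, and the filter criteria) are illustrative assumptions, since the question doesn't give a schema:

```sql
-- Staging table holding the IDs to delete (names are illustrative)
CREATE TABLE dbo.DeleteQueue (
    TargetId INT NOT NULL,
    JobId    INT NOT NULL,
    PRIMARY KEY (TargetId, JobId)
);

-- Populate it; this step can run slowly without blocking other work
INSERT INTO dbo.DeleteQueue (TargetId, JobId)
SELECT t.Id, 42            -- 42 stands in for this job's ID
FROM dbo.BigTable AS t
WHERE t.SomeCriteria = 1;  -- however you identify the ~30k rows
```

Indexing the staging table on the ID column (here via the primary key) is what makes the later join-based delete fast.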
Once this table is populated, you can write a DELETE statement that includes a JOIN to this new table. In my experience, this will be the fastest way to complete your delete operation.
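Continuing the same hypothetical names, the join-based delete could be sketched as:

```sql
-- Delete by joining to the staging table; the join lets the optimizer
-- drive the delete from an index rather than a huge IN list
DELETE t
FROM dbo.BigTable AS t
INNER JOIN dbo.DeleteQueue AS q
    ON q.TargetId = t.Id
WHERE q.JobId = 42;  -- illustrative job ID
```

If even this holds locks for too long, it can be wrapped in a loop with `DELETE TOP (N)` to work in smaller batches.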
This doesn't sound like your situation, but when you really need it to run faster you can also try running the job during a period of lower load, or if you have a maintenance window you can switch to bulk-logged for a while, though this is an option of last resort.
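For completeness, switching recovery models during a maintenance window looks like the following (the database name is a placeholder). Note that after returning to full recovery you should take a log backup to re-establish the log chain:

```sql
-- Last resort, during a maintenance window only
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

-- ... run the delete job here ...

ALTER DATABASE MyDb SET RECOVERY FULL;

-- Re-establish the log backup chain
BACKUP LOG MyDb TO DISK = N'C:\Backups\MyDb_log.trn';
```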
After the delete completes, truncate the extra table, or delete just the rows matching your job ID if other jobs share it.
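Either cleanup, using the same assumed names as above:

```sql
-- Option 1: the staging table is single-purpose, clear it entirely
TRUNCATE TABLE dbo.DeleteQueue;

-- Option 2: other jobs share the table, remove only this job's rows
DELETE FROM dbo.DeleteQueue
WHERE JobId = 42;  -- illustrative job ID
```

`TRUNCATE TABLE` is minimally logged and nearly instant, so prefer it when the table holds nothing you need to keep.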
If this still isn't fast enough, you can improve things further by implementing logical deletes in your application. A logical delete means adding a column to the table with a name like `IsDeleted` or `DeletedDate`; deleting a row then only requires updating the value of that column. You can also run a separate background process that physically cleans up these records after a certain amount of time, if you want.
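A sketch of the logical-delete approach, again with assumed table and column names:

```sql
-- Add a nullable deleted-at column (names are illustrative)
ALTER TABLE dbo.BigTable ADD DeletedDate DATETIME2 NULL;

-- "Deleting" a row is now a cheap update instead of a physical delete
UPDATE dbo.BigTable
SET DeletedDate = SYSUTCDATETIME()
WHERE Id = @TargetId;

-- Optional background cleanup: physically remove rows soft-deleted
-- more than 30 days ago, in small batches to keep lock durations short
DELETE TOP (1000)
FROM dbo.BigTable
WHERE DeletedDate < DATEADD(DAY, -30, SYSUTCDATETIME());
```

The trade-off is that every query serving live data now needs a `WHERE DeletedDate IS NULL` filter (a filtered index on that predicate can help).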
Logical deletes can be much faster when processing larger batches. However, they can mean rewriting significant parts of your application so it distinguishes between deleted records and active records.