I have a PostgreSQL table with 2+ million records that I need to narrow down to about 100k.
I have a list of values to filter by:
codes = ['17389', '77289', ...]
And I have a working SQLAlchemy statement:
stmt = BarCodes.__table__.delete().where(~BarCodes.code.in_(codes))
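For context, here is a minimal self-contained sketch of my setup (the model definition and column types are simplified, and SQLite in-memory stands in here so the snippet runs anywhere; the real database is PostgreSQL):

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class BarCodes(Base):
    __tablename__ = "bar_codes"
    id = Column(Integer, primary_key=True)
    code = Column(String, index=True)

# SQLite in-memory stands in for the real PostgreSQL database
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

codes = ['17389', '77289']  # the real list has ~100k entries

with Session(engine) as session:
    session.add_all([BarCodes(code=c) for c in ['17389', '77289', '99999']])
    session.commit()

    # Delete every row whose code is NOT in the keep-list
    stmt = BarCodes.__table__.delete().where(~BarCodes.code.in_(codes))
    session.execute(stmt)
    session.commit()

    remaining = [row.code for row in session.query(BarCodes).all()]
```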
The only issue is that this statement takes a long time to run. My guess is that every row is being checked against the full 100k-element list. Is there a more efficient way to do this?