With this code I get a very long runtime (over 24 hours) working on a large list (~150M elements, each a tuple of 4 strings). I need to delete around 66M tuples from it:
def erase_random_elements(elements_list, iterable_random_numbers):
    for i in sorted(iterable_random_numbers, reverse=True):
        elements_list.pop(i)
    return elements_list
It seems I've got enough RAM for it, so I don't need to chunk the list. How can I do this faster?
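For context, each `list.pop(i)` from the middle of a list shifts every element after index `i`, so 66M pops on a 150M element list is quadratic overall. A common linear-time alternative is to rebuild the list once, skipping the indices to delete; this is a sketch of that idea (the function name `erase_random_elements_fast` is mine, not from the question):

```python
def erase_random_elements_fast(elements_list, iterable_random_numbers):
    # Membership tests against a set are O(1) on average,
    # so the whole rebuild is O(n) instead of O(n * m).
    to_delete = set(iterable_random_numbers)
    # Keep every element whose index is not marked for deletion;
    # assigning to elements_list[:] mutates the list in place,
    # matching the original function's behavior.
    elements_list[:] = [
        item for i, item in enumerate(elements_list) if i not in to_delete
    ]
    return elements_list
```

This does temporarily hold a second list of the surviving elements, which should be fine given the stated RAM headroom.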