There are two separate processes running in a Python script. Both interact with a global list, POST_QUEUE = []:
- Process 1 (P1) adds items to POST_QUEUE every 60 seconds. This can be anywhere from 0 to 50 items at a time.
- Process 2 (P2) iterates over POST_QUEUE via a for-loop at set intervals and performs an operation on the list items one at a time. After performing said operation, the process removes the item from the list.
Below is a generalized version of P2:
def Process_2():
    for post in POST_QUEUE:
        if perform_operation(post):
            print("Success!")
        else:
            print("Failure.")
        POST_QUEUE.remove(post)
Understandably, I've run into an issue: removing items from a list while a for-loop is iterating over it shifts the indexing, so the loop skips elements and terminates earlier than expected (i.e., before it performs the necessary operation on each post and removes it from POST_QUEUE).
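To make the failure mode concrete, here is a minimal sketch of the problem, with a trivial stand-in for perform_operation (the real operation is not shown in the question):

```python
# Removing items from a list while iterating over it skips elements:
# each removal shifts the remaining items left, but the loop's internal
# index keeps advancing.
queue = ["a", "b", "c", "d"]
processed = []

for item in queue:
    processed.append(item)  # stand-in for perform_operation(item)
    queue.remove(item)      # mutates the list mid-iteration

print(processed)  # ['a', 'c']  -- "b" and "d" were never visited
print(queue)      # ['b', 'd']  -- and never removed
```

After visiting index 0 ("a") and removing it, "b" slides into index 0; the loop then moves to index 1, which is now "c", so "b" is skipped entirely.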
Is there a better way to do this than just creating a copy of POST_QUEUE and having P2 iterate over that while removing items from the original POST_QUEUE object? For example:
def Process_2():
    POST_QUEUE_COPY = POST_QUEUE[:]
    for post in POST_QUEUE_COPY:
        if perform_operation(post):
            print("Success!")
        else:
            print("Failure.")
        POST_QUEUE.remove(post)
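For what it's worth, a quick sanity check of the copy approach (again with a stand-in for perform_operation) shows that every item is visited and removed, since the loop iterates over the shallow copy while only the original list is mutated:

```python
POST_QUEUE = ["a", "b", "c", "d"]
processed = []

for post in POST_QUEUE[:]:    # iterate over a shallow copy
    processed.append(post)    # stand-in for perform_operation(post)
    POST_QUEUE.remove(post)   # safe: the iterated list is untouched

print(processed)   # ['a', 'b', 'c', 'd']
print(POST_QUEUE)  # []
```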