I developed a web application for running email campaigns. A cron job runs every minute to send multiple emails (up to 100 in a single request), selected with this query:
SELECT id,email,email_text FROM recipients
WHERE sent_status=0 LIMIT 100
This script takes approximately 70-100 seconds to send all the emails using PHP. After sending each email, I update sent_status = 1.
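Concretely, the per-email update is roughly:

-- Run once per recipient, right after that email goes out.
UPDATE recipients SET sent_status = 1 WHERE id = ?;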
Now the problem: due to shared hosting, the script cannot process more than 50-60 records in 60 seconds, so the next cron request starts and also selects the roughly 40 records that the first request is still processing and has not yet marked as sent. As a result, some recipients receive duplicate emails.
Can this be prevented by using locking, or is there another solution?
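For example, would it work to have each cron run "claim" a batch of rows first and then only process the rows it claimed? A rough sketch of what I mean (not tested; batch_token would be a new nullable column added to recipients just for this):

-- Each run tags its own batch with a unique token.
SET @token = UUID();

-- Atomic claim: only rows that are unsent AND not yet claimed get
-- tagged, so two overlapping runs cannot grab the same rows.
UPDATE recipients
SET batch_token = @token
WHERE sent_status = 0
  AND batch_token IS NULL
LIMIT 100;

-- The script then processes only the rows it claimed.
SELECT id, email, email_text
FROM recipients
WHERE batch_token = @token;

Or is a plain SELECT ... FOR UPDATE inside a transaction the more standard way to handle this?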
UPDATE
My question is very similar to the linked duplicate question; however, I am actually SELECTing data from multiple tables, using GROUP BY, and using an ORDER BY clause on multiple columns, including RAND().
My actual query is something like this:
SELECT s.sender_name,
s.sender_email,
r.recipient_name,
r.email,
c.campaign_id,
c.email_text
FROM users s, recipients r, campaigns c
WHERE c.sender_id=s.sender_id
AND c.recipient_id=r.recipient_id
AND sent_status=0
GROUP BY c.sender_id, r.recipient_id
ORDER BY DATE(previous_sent_time), RAND()
LIMIT 100
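If the claiming idea above is the right direction, I assume it would have to be applied to whichever table actually holds sent_status, and the ordering would have to move into the claiming UPDATE. A rough, untested sketch, assuming sent_status, the new batch_token column, and previous_sent_time all live on campaigns:

-- MySQL allows ORDER BY / LIMIT in a single-table UPDATE,
-- so the batch selection logic can move here.
SET @token = UUID();

UPDATE campaigns
SET batch_token = @token
WHERE sent_status = 0
  AND batch_token IS NULL
ORDER BY DATE(previous_sent_time), RAND()
LIMIT 100;

-- The big SELECT above would then filter on the claimed batch:
--   AND c.batch_token = @token
-- instead of:
--   AND sent_status = 0

Does the GROUP BY / ORDER BY RAND() part change anything about how locking would behave here?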
Thanks