I suggest turning on the sonar. I have three Event links hanging off my profile page, where I created a few helper tables (also shown in those links) to assist in turning on the sonar and seeing what is going on inside your Events. Note that you can expand on this for performance tracking, as I did in those links.
Remember that Events succeed or fail (in your mind) based on the data, and they do so silently. But by tracking what is going on, you can vastly increase your happiness level when developing them.
Event:
DROP EVENT IF EXISTS move_to_archive_category;
DELIMITER $$
CREATE EVENT move_to_archive_category
ON SCHEDULE EVERY 1 MINUTE STARTS '2015-09-01 00:00:00'
ON COMPLETION PRESERVE
DO
BEGIN
DECLARE incarnationId int default 0;
DECLARE evtAlias varchar(20);
SET evtAlias:='move_2_archive';
INSERT incarnations(usedBy) VALUES (evtAlias);
SELECT LAST_INSERT_ID() INTO incarnationId;
INSERT EvtsLog(incarnationId,evtName,step,debugMsg,dtWhenLogged)
SELECT incarnationId,evtAlias,1,'Event Fired, begin looking',now();
INSERT INTO `oc_product_to_category` (product_id, category_id)
SELECT p.product_id, 68 AS category_id
FROM oc_product p
WHERE p.event_end < NOW() AND p.event_end <> '0000-00-00'
AND NOT EXISTS (SELECT 1 FROM oc_product_to_category c
                WHERE c.product_id = p.product_id AND c.category_id = 68);
-- the NOT EXISTS keeps the every-minute firings from stacking up duplicate rows
-- perhaps collect metrics for above insert and use that in debugMsg below
-- perhaps with a CONCAT into a msg
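-- one way to do that (a sketch; @rowsMoved and @msg are just session variables I made up):
SET @rowsMoved := ROW_COUNT();  -- rows affected by the INSERT just above
SET @msg := CONCAT('INSERT finished, rows moved to category 68: ', @rowsMoved);
-- the step 10 log insert below could then select @msg in place of the literal 'INSERT finished'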
INSERT EvtsLog(incarnationId,evtName,step,debugMsg,dtWhenLogged)
SELECT incarnationId,evtAlias,10,'INSERT finished',now();
-- pretend there is more stuff
-- ...
-- ...
INSERT EvtsLog(incarnationId,evtName,step,debugMsg,dtWhenLogged)
SELECT incarnationId,evtAlias,99,'Event Finished',now();
END $$
DELIMITER ;
Tables:
create table oc_product_to_category
( product_id INT not null,
category_id INT not null
);
create table oc_product
( product_id INT not null,
event_end datetime not null
);
drop table if exists incarnations;
create table incarnations
( -- NoteA
-- a control table used to feed incarnation ids to events that want performance reporting.
-- The long and short of it: insert a row here merely to acquire an auto_increment id
id int auto_increment primary key,
usedBy varchar(50) not null
-- other columns could be added, such as how it was used or a datetime,
-- but mainly this table just feeds back an auto_increment value.
-- usedBy is essentially a dummy column: the insert has to insert something,
-- so we insert into usedBy and then capture last_insert_id()
);
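To make that handshake concrete, here is a minimal sketch of acquiring an incarnation id by hand, the same pattern the event above uses (the usedBy value 'manual_test' is just an example):
INSERT incarnations(usedBy) VALUES ('manual_test'); -- dummy row, we only want the id
SELECT LAST_INSERT_ID(); -- this value is what an event stores as incarnationId in EvtsLog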
drop table if exists EvtsLog;
create table EvtsLog
( id int auto_increment primary key,
incarnationId int not null, -- See NoteA (above)
evtName varchar(20) not null, -- allows for use of this table by multiple events
step int not null, -- facilitates reporting on event level performance
debugMsg varchar(1000) not null,
dtWhenLogged datetime not null
-- tweak this with whatever indexes you can bear to have.
-- Run maintenance on this table periodically to rid it of unwanted rows,
-- since its size impacts performance. So, move the rows out to an archive table or whatever.
);
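As for that maintenance comment, a rough sketch of the periodic cleanup could look like this (the EvtsLogArchive name and the 30-day cutoff are just assumptions, pick whatever you can live with):
CREATE TABLE IF NOT EXISTS EvtsLogArchive LIKE EvtsLog; -- same shape as EvtsLog
INSERT INTO EvtsLogArchive
SELECT * FROM EvtsLog WHERE dtWhenLogged < NOW() - INTERVAL 30 DAY;
DELETE FROM EvtsLog WHERE dtWhenLogged < NOW() - INTERVAL 30 DAY;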
Turn on Events:
show variables where variable_name='event_scheduler'; -- OFF currently
SET GLOBAL event_scheduler = ON; -- turn her on
SHOW EVENTS in so_gibberish; -- confirm
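If you prefer, the same confirmation is available from information_schema (so_gibberish is just the name of my test database):
SELECT event_name, status, last_executed
FROM information_schema.events
WHERE event_schema = 'so_gibberish';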
Confirm the Event is firing:
SELECT * FROM EvtsLog WHERE step=1 ORDER BY id DESC; -- verify with our sonar
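And since the whole point of the step column is performance reporting, a query like the following sketch (assuming each firing logs both a step 1 and a step 99 row, as the event above does) shows the elapsed time per incarnation:
SELECT b.incarnationId,
       TIMESTAMPDIFF(SECOND, b.dtWhenLogged, e.dtWhenLogged) AS secondsTaken
FROM EvtsLog b
JOIN EvtsLog e ON e.incarnationId = b.incarnationId AND e.step = 99
WHERE b.step = 1
ORDER BY b.incarnationId DESC;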

For more detail on those helper tables, visit the Event links off my profile page; it is pretty much just the one link, for Performance Tracking and Reporting.
You will also note that it does not matter, at the moment, whether there is any data in the actual tables you were originally focusing on. That can come later, and it can be reported in the event log table by CONCATing the counts and such into a string variable and logging that at a step number like 10 or 20.
The point is, without something like this you are completely blind as to what is going on.