
I am trying to set up a cron job script that checks for and deletes records with timestamps older than 15 minutes, but the script below is not working and I have no clue why. I am unfamiliar with date comparisons and the solutions I found online did not work for my goal. A script that accomplishes this would be fantastic! Thank you in advance!

<?php

include('../../db/db.php');

mysql_select_db($database_conn_abrv, $con);

$timetocheck = date('Y-m-d H:i:s', mktime(date("H"), date("i") - 15, date("s"), date("m"), date("d"), date("Y"));

mysql_query("DELETE FROM loginbans WHERE time_created>$timetocheck",$con) or die(mysql_error());

?>
  • Possible duplicate of [How do I delete rows of data from mysql table automatically with 24 hours after data into table?](https://stackoverflow.com/questions/44253746/how-do-i-delete-rows-of-data-from-mysql-table-automatically-with-24-hours-after) – Alexander Dec 27 '18 at 16:36

2 Answers


You can use INTERVAL X MINUTE for MySQL:

DELETE FROM loginbans
WHERE time_created < (NOW() - INTERVAL 15 MINUTE)
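
For the cron side, here is a minimal sketch of how that query might be run from PHP. It assumes a mysqli connection (the mysql_* functions used in the question are deprecated and removed in PHP 7), and the credentials are placeholders for whatever your db.php provides:

    <?php

    // Placeholder credentials; substitute the values your db.php provides.
    $mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
    if ($mysqli->connect_error) {
        die('Connection failed: ' . $mysqli->connect_error);
    }

    // Let MySQL do the date arithmetic: remove rows older than 15 minutes.
    $sql = 'DELETE FROM loginbans WHERE time_created < NOW() - INTERVAL 15 MINUTE';
    if (!$mysqli->query($sql)) {
        die('Delete failed: ' . $mysqli->error);
    }

    $mysqli->close();

This avoids building the cutoff timestamp in PHP altogether, so there is no string formatting or quoting to get wrong.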
– GGio

Why do you want to delete the records? That requires a scheduled job that runs every few minutes, which seems like a burden on the system. It also requires that the clocks be synchronized between the machine running the database and the one running the cron job (admittedly, these would normally be the same computer, but not necessarily).

Instead, you can create a view on a table that only gets records from the last 15 minutes:

create view v as
    select lb.*
    from loginbans lb
    where time_created >= now() - interval 15 minute;
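
As a rough usage sketch, assuming the same kind of mysqli connection as in the earlier example, the application then reads from the view and never sees expired rows:

    // $mysqli: the same kind of connection as in the earlier sketch.
    $result = $mysqli->query('SELECT * FROM v');
    while ($row = $result->fetch_assoc()) {
        // Only bans created within the last 15 minutes show up here.
    }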

Or, you could add an isdeleted flag into the view:

create view v as
    select lb.*, (time_created < now() - interval 15 minute) as isdeleted
    from loginbans lb;
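
With this variant nothing is ever physically removed: the application filters on the flag (for example, WHERE NOT isdeleted) to see only active bans, while expired rows stay in loginbans and remain available for auditing or later analysis.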

If you are generating lots and lots of data in every 15-minute interval (think at least tens of thousands of records), I would suggest that you look into partitions. You can partition the data by time_created and drop the partitions periodically. Dropping a partition is much less expensive than deleting rows.
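
A minimal sketch of what that could look like, assuming time_created is a TIMESTAMP column and that every unique key on loginbans includes it (MySQL requires this for the partitioning column); the partition names and boundaries are purely illustrative:

    // Illustrative only: assumes time_created is a TIMESTAMP
    // (UNIX_TIMESTAMP is only valid as a partitioning function on
    // TIMESTAMP columns); partition names and boundaries are examples.
    $mysqli->query("
        ALTER TABLE loginbans
        PARTITION BY RANGE (UNIX_TIMESTAMP(time_created)) (
            PARTITION p0   VALUES LESS THAN (UNIX_TIMESTAMP('2019-01-01 00:15:00')),
            PARTITION p1   VALUES LESS THAN (UNIX_TIMESTAMP('2019-01-01 00:30:00')),
            PARTITION pmax VALUES LESS THAN MAXVALUE
        )
    ") or die($mysqli->error);

    // The scheduled job then drops whole expired partitions instead of
    // deleting individual rows, which is far cheaper.
    $mysqli->query('ALTER TABLE loginbans DROP PARTITION p0')
        or die($mysqli->error);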

– Gordon Linoff