I'm working on a Laravel 8 application, backed by a PostgreSQL database, in which I'm generating a Cost
model for various items. My intention was to record at most one Cost->value
per day, per item; however, due to overlapping jobs and the way I was using the updateOrCreate()
method, I've ended up with multiple Cost
records per day for each item.
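For reference, the write was roughly along these lines (a simplified sketch, not the exact code; $item and $value are placeholders). With concurrent jobs, each one could see "no row for today yet" and insert its own:

// Simplified sketch of the write that produced the duplicates:
// match on item + today's date, otherwise insert a new row.
Cost::updateOrCreate(
    ['item' => $item, 'created_at' => now()->toDateString()],
    ['value' => $value]
);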
I've since fixed the logic so that I'm no longer getting multiple records per day, but I'd like to now go back and clean up all of the duplicate records.
Is there an efficient way to delete all of the duplicate records per item, keeping only the newest record for each day, i.e. no more than one record per item, per day? I'm sure this seems pretty straightforward, but I can't seem to land on the correct logic, either directly in SQL or through Laravel and PHP.
Possibly relevant: there are currently ~50k records in the table.
Example table
// Example database table migration
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

Schema::create('costs', function (Blueprint $table) {
    $table->id();
    $table->string('item');
    $table->decimal('value');
    $table->date('created_at');      // stored as a date (one value per day intended)
    $table->timestamp('updated_at');
});
Rough Example (Before)
id,item,value,created_at,updated_at
510,item1,12,2021-07-02,2021-07-02 16:45:17
500,item1,13,2021-07-02,2021-07-02 16:45:05
490,item1,13,2021-07-02,2021-07-02 16:45:01
480,item2,12,2021-07-02,2021-07-02 16:44:59
470,item2,14,2021-07-02,2021-07-02 16:44:55
460,item2,12,2021-07-02,2021-07-02 16:44:54
450,item2,11,2021-07-02,2021-07-02 16:44:53
Rough Example (Desired End-State)
id,item,value,created_at,updated_at
510,item1,12,2021-07-02,2021-07-02 16:45:17
480,item2,12,2021-07-02,2021-07-02 16:44:59
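In the example above, I'd want to keep ids 510 and 480 (the latest updated_at for each item on that day) and delete the rest. Conceptually, the operation I'm picturing is something along these lines (an untested sketch using Postgres' DISTINCT ON; I'm not sure it's correct, or whether it's a sensible approach for ~50k rows):

use Illuminate\Support\Facades\DB;

// For each (item, created_at) pair, keep the row with the latest updated_at
// and delete every other row. DISTINCT ON is Postgres-specific.
DB::delete('
    DELETE FROM costs
    WHERE id NOT IN (
        SELECT DISTINCT ON (item, created_at) id
        FROM costs
        ORDER BY item, created_at, updated_at DESC
    )
');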