I have a database table on which I am using INSERT ... ON DUPLICATE KEY UPDATE, where the UNIQUE constraint is on a non-primary-key column.
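For context, here is a minimal sketch of the setup (the table and column names are hypothetical, not my actual schema):

```sql
-- Hypothetical schema: auto-increment PK plus a UNIQUE key on another column
CREATE TABLE items (
    id   INT(11) NOT NULL AUTO_INCREMENT,
    code VARCHAR(64) NOT NULL,
    qty  INT NOT NULL DEFAULT 0,
    PRIMARY KEY (id),
    UNIQUE KEY uq_code (code)
) ENGINE=InnoDB;

-- Upsert keyed on the UNIQUE column. Under InnoDB's default
-- innodb_autoinc_lock_mode, an auto-increment value is reserved for
-- every INSERT attempt, even when the row ends up being updated,
-- which is where the gaps in the primary key come from.
INSERT INTO items (code, qty)
VALUES ('ABC-123', 1)
ON DUPLICATE KEY UPDATE qty = qty + 1;
```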
Initially I did not notice, but now that the table has grown to almost 500k records, I see that the primary key value has climbed far too high. Currently I have 507,811 records in the table, but the highest primary key is 4,841,123, which is much higher than it should be.
As quoted here:
"INSERT ... ON DUPLICATE KEY UPDATE works similarly to REPLACE in that when a row with an existing (duplicate) key is found, it's actually removed (1 row affected) and then a new row created in its place (1 more row affected), giving the illusion of updating/replacing a row.",
so should I really use primary key field, because eventually it will go out range one day as the data keeps increasing.
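Some back-of-the-envelope arithmetic with the numbers above, assuming the current burn rate of roughly 4,841,123 / 507,811 ≈ 9.5 auto-increment values per stored row stays constant (a signed INT tops out at 2,147,483,647):

```sql
-- Rough headroom estimate using the figures from the question.
-- Signed INT maximum: 2147483647.
SELECT FLOOR((2147483647 - 4841123) / (4841123 / 507811)) AS approx_rows_until_overflow;
-- roughly 224 million more rows at the current rate
```

So the limit is not imminent, but it is finite.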
I know it is not a bug, but I really need more insight into this. I have searched a lot, and as far as I can tell I should either keep the column but drop the primary key and auto-increment attributes from it, or increase the column's size; currently it is int(11), i.e. a signed INT with a maximum of 2,147,483,647.
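If increasing the column size is the way to go, I assume the change would look something like this (table and column names hypothetical; a signed BIGINT maxes out at 9,223,372,036,854,775,807):

```sql
-- Widen the auto-increment PK from INT to BIGINT UNSIGNED.
-- Note: this rebuilds the table, so it may lock it for a while.
ALTER TABLE items
    MODIFY id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT;
```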