I'm working with a Rails 4 app that needs to create a large number of objects in response to events from another system. I'm getting very frequent ActiveRecord::RecordNotUnique errors (caused by PG::UniqueViolation) on the primary key column when I call create! on one of my models.
I found other answers on SO that suggest rescuing the exception and calling retry:
begin
  TableName.create!(data: 'here')
rescue ActiveRecord::RecordNotUnique => e
  if e.message.include? '_pkey' # Only retry primary key violations
    log.warn "Retrying creation: #{e}" # log is the app's logger
    retry
  else
    raise
  end
end
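To keep a stuck sequence from looping forever, I'm also considering capping the retries. This is just a sketch of what I have in mind (MAX_RETRIES is an arbitrary number I'd pick, and log again stands in for my app's logger):

MAX_RETRIES = 10

attempts = 0
begin
  TableName.create!(data: 'here')
rescue ActiveRecord::RecordNotUnique => e
  raise unless e.message.include?('_pkey') # only retry primary key violations
  attempts += 1
  raise if attempts > MAX_RETRIES # give up instead of spinning indefinitely
  log.warn "Retrying creation (attempt #{attempts}): #{e}"
  retry
end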
While this seems to help, I am still getting tons of ActiveRecord::RecordNotUnique errors for sequential IDs that already exist in the database (log entries abbreviated):
WARN -- Retrying creation: PG::UniqueViolation: DETAIL: Key (id)=(3067) already exists.
WARN -- Retrying creation: PG::UniqueViolation: DETAIL: Key (id)=(3068) already exists.
WARN -- Retrying creation: PG::UniqueViolation: DETAIL: Key (id)=(3069) already exists.
WARN -- Retrying creation: PG::UniqueViolation: DETAIL: Key (id)=(3070) already exists.
The IDs it's trying are in the 3000-4000 range, even though there are over 90000 records in the table in question.
Why is ActiveRecord or PostgreSQL wasting so much time sequentially trying existing IDs?
The original exception (simplified, with the query string removed):
{
  "exception": "ActiveRecord::RecordNotUnique",
  "message": "PG::UniqueViolation: ERROR: duplicate key value violates unique constraint \"table_name_pkey\"\nDETAIL: Key (id)=(3023) already exists."
}
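In case it's relevant, here's the check I'm planning to run from the Rails console to compare the primary key sequence against the table's contents (table_names and its default sequence table_names_id_seq are stand-ins for my real table and sequence names):

# Current value of the primary key sequence
ActiveRecord::Base.connection.select_value("SELECT last_value FROM table_names_id_seq")

# Highest id actually present in the table
ActiveRecord::Base.connection.select_value("SELECT MAX(id) FROM table_names")

If the sequence's last_value turns out to be well behind MAX(id), I assume that would explain create! handing out IDs that already exist, but I don't understand how it would have gotten into that state.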