We are using EF Core, but we also upgrade pre-EF Core databases to the latest version. On Postgres there is a table:
CREATE TABLE "DatabaseVersion"
(
"DatabaseVersionId" integer NOT NULL GENERATED BY DEFAULT AS IDENTITY ( INCREMENT 1 START 1 MINVALUE 1 MAXVALUE 2147483647 CACHE 1 ) PRIMARY KEY,
"Version" varchar(20) NOT NULL,
"VersionDate" timestamptz NULL DEFAULT Now(),
"Description" varchar(2000) NULL,
"AppliedDate" timestamptz NULL DEFAULT Now()
);
In PgAdmin, I create a database and this table, then I insert into the table for a test using:
INSERT INTO "DatabaseVersion" ("Version")
VALUES ('1.0.0.0');
and the first row is inserted. I can then change the version in the insert statement and insert more rows; each row gets a unique DatabaseVersionId.
However, when I run the upgrade and attempt to insert a value using EF Core, I get an exception:
duplicate key value violates unique constraint
I created a new database to see how EF Core defined the key and it is:
GENERATED BY DEFAULT AS IDENTITY (INCREMENT 1 START 1 MINVALUE 1 MAXVALUE 2147483647 CACHE 1).
I added PRIMARY KEY to that definition to avoid adding a lot of separate primary key constraints to our older scripts.
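For reference, the model configuration that produces that definition looks roughly like this (a simplified sketch, not our exact code; UseIdentityByDefaultColumn is the standard Npgsql EF Core call):

using System;
using Microsoft.EntityFrameworkCore;

public class DatabaseVersion
{
    public int DatabaseVersionId { get; set; }
    public string Version { get; set; }
    public DateTime? VersionDate { get; set; }
    public string Description { get; set; }
    public DateTime? AppliedDate { get; set; }
}

// In the DbContext:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<DatabaseVersion>(b =>
    {
        b.HasKey(e => e.DatabaseVersionId);
        // Emits GENERATED BY DEFAULT AS IDENTITY in the generated DDL.
        b.Property(e => e.DatabaseVersionId).UseIdentityByDefaultColumn();
        b.Property(e => e.Version).HasMaxLength(20).IsRequired();
        b.Property(e => e.Description).HasMaxLength(2000);
    });
}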
Our update code is shown here (T would be DatabaseVersion):
var dbSet = context.Set<T>();
var id = GetKey(context, item);
if (id == 0)
{
    // No key yet: let the database generate the identity value.
    dbSet.Add(item);
}
else
{
    // Existing row: load it and copy the incoming values onto it.
    var p = dbSet.Find(id);
    dbSet.Attach(p);
    var a = context.Entry(p);
    if (a != null)
        a.CurrentValues.SetValues(item);
}
context.SaveChanges();
What needs to be done to older scripts to make the identity primary key work?
Or is that even the problem? I am not a Postgres expert.
Our database upgrade uses scripts until it gets to a certain version, and from there we run each migration individually. EF Core figures out on its own not to run the initial create migration, but it still adds an entry to the __EFMigrationsHistory table, which is great.
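The migration part of the upgrade looks roughly like this (a simplified sketch, not our exact code; GetPendingMigrations and IMigrator are standard EF Core APIs):

// context is our DbContext. GetService comes from
// Microsoft.EntityFrameworkCore.Infrastructure and IMigrator from
// Microsoft.EntityFrameworkCore.Migrations.
var migrator = context.GetService<IMigrator>();
foreach (var migration in context.Database.GetPendingMigrations())
{
    // Apply one migration at a time so script-based steps can run in between.
    migrator.Migrate(migration);
}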
Update
In my searches I found this question. During our upgrade or creation of a new database we insert data that already has the primary key set, so with PostgreSQL the sequence is not updated, and the first insert that relies on the identity gets a duplicate key exception. To get around this, after creating or upgrading a database you can use the following script:
@"SELECT setval(
(SELECT pg_get_serial_sequence( '""<%SchemaName%>"".""<%TableName%>""', '<%KeyName%>')),
(SELECT MAX(""<%KeyName%>"") from ""<%SchemaName%>"".""<%TableName%>""),
true );";
Using this in a loop with each table and primary key name, all the sequences can be set to their proper values.
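For example, the loop can look roughly like this (a sketch; the hard-coded table list is just for illustration, ExecuteSqlRaw is the standard EF Core call):

// context is our DbContext; ExecuteSqlRaw comes from Microsoft.EntityFrameworkCore.
// The table/key list here is illustrative; we build ours from the model.
var tables = new[]
{
    (Schema: "public", Table: "DatabaseVersion", Key: "DatabaseVersionId"),
};

const string template = @"SELECT setval(
    (SELECT pg_get_serial_sequence( '""<%SchemaName%>"".""<%TableName%>""', '<%KeyName%>')),
    (SELECT MAX(""<%KeyName%>"") from ""<%SchemaName%>"".""<%TableName%>""),
    true );";

foreach (var t in tables)
{
    var sql = template
        .Replace("<%SchemaName%>", t.Schema)
        .Replace("<%TableName%>", t.Table)
        .Replace("<%KeyName%>", t.Key);

    // Bring the identity sequence in line with the highest existing key
    // (assumes the table already has at least one row).
    context.Database.ExecuteSqlRaw(sql);
}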