
In my project I have an entity model that looks like this:

public class OrdersDbContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
    public DbSet<Order> Orders { get; set; }
    public DbSet<OrderPosition> OrderPositions { get; set; }
}

public class Customer
{
    public Guid Id { get; set; }
    public virtual Order LiveOrder { get; set; }
}

public class Order
{
    public Guid Id { get; set; }
    public virtual IList<OrderPosition> Positions { get; set; }
    public string Name { get; set; }
}

public class OrderPosition
{
    public Guid Id { get; set; }
    public int Price { get; set; }
}

And I need to provide a method that will update some of a Customer's properties as well as his LiveOrder and all of its OrderPositions (insert new, update existing and delete old). I tried to do it in several ways, but all of them failed:

  1. Delete the order and order positions and then insert new ones — failed with a duplicate key exception.
  2. Detach the order and then attach the updated one — failed with:

Attaching an entity of type 'OrderPosition' failed because another entity of the same type already has the same primary key value.

What is the right way of doing it?

Full console program that demonstrates the issue:

using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
public class OrdersDbContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
    public DbSet<Order> Orders { get; set; }
    public DbSet<OrderPosition> OrderPositions { get; set; }
}

public class Customer
{
    public Guid Id { get; set; }
    public virtual Order LiveOrder { get; set; }
}

public class Order
{
    public Guid Id { get; set; }
    public virtual IList<OrderPosition> Positions { get; set; }
    public string Name { get; set; }
}

public class OrderPosition
{
    public Guid Id { get; set; }
    public int Price { get; set; }
}

class Program
{
    static void Main(string[] args)
    {
        var parentId = Guid.NewGuid();
        var childId = Guid.NewGuid();

        using (var ctx = new OrdersDbContext())
        {
            var agg = new Customer
            {
                Id = Guid.NewGuid(),
                LiveOrder = new Order
                {
                    Id = parentId,
                    Name = "First order.",
                    Positions = new[] { new OrderPosition { Id = childId, Price = 3 } }
                }
            };
            ctx.Customers.Add(agg);
            ctx.SaveChanges();
        }

        var newParent = new Order
        {
            Id = parentId,
            Name = "Updated order.",
            Positions = new[] { new OrderPosition { Id = childId, Price = 5 } }
        };
        try
        {
            using (var ctx = new OrdersDbContext())
            {
                var agg = ctx.Customers.First(x => x.LiveOrder.Id == parentId);

                ctx.OrderPositions.RemoveRange(agg.LiveOrder.Positions);
                var parent = agg.LiveOrder;
                agg.LiveOrder = null;
                ctx.Entry(parent).State = EntityState.Detached;
                Console.WriteLine(ctx.Entry(parent).State);
                ctx.Entry(newParent).State = EntityState.Modified;
                agg.LiveOrder = newParent;

                ctx.SaveChanges();
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex);
        }
        Console.WriteLine("Press any key to exit.");
        Console.ReadLine();
    }
}
Oleg Oshkoderov
  • As far as I know, Entity Framework will not bulk update/insert, so if you are changing the order position you will always end up with duplicated values. One method I have seen is to use floats for your order with randomly generated fractions while still keeping the integer part incremental – Michael Coxon Nov 10 '14 at 15:48
  • I have managed to update it with an additional SaveChanges after deleting the Order and OrderPositions and then inserting them. But I feel that there should be a better way of doing it with a single SaveChanges. I think I would be able to detach all the OrderPositions as well and then attach new/updated ones. But in that case it feels like I'm doing the ORM's job. – Oleg Oshkoderov Nov 10 '14 at 15:52
  • I copied my comment to an answer as I felt it offered a new way of doing it. – Michael Coxon Nov 10 '14 at 15:53
  • First point... do it in a transaction!... what if one of your changes fails? but some others succeed? Corrupt data? – Paul Zahra Nov 10 '14 at 16:13
  • Why aren't you calling save changes after `agg.LiveOrder = null;`? The database is not going to be able to handle that. You are detaching the Entity store from the DB and creating a new object in the entity store which is not going to link up to the object in the database. The issue you are having here is that the row still exists in the db, on the client side you are removing it, then telling the entity store to forget about it, then adding a new one to the entity store, then committing it. This will make EF do an insert instead of an update, with the same Guid causing a Duplicate Key error. – Michael Coxon Nov 10 '14 at 16:19
  • You might have a look at my answer on [ASP.NET MVC - Attaching an entity of type 'MODELNAME' failed because another entity of the same type already has the same primary key value](http://stackoverflow.com/questions/23201907/asp-net-mvc-attaching-an-entity-of-type-modelname-failed-because-another-ent/39557606#39557606). – Murat Yıldız Sep 18 '16 at 12:29

3 Answers


This is not really an EF problem - you would run into the same thing using pure SQL.

SQL Server checks uniqueness on every update, not on commit, so a reorder temporarily creates non-unique values.

You need two passes. First move the values out of the way, then update them to their final position.

How? Well, the SQL Server data types support negative numbers ;) Put the rows into their new position, but negative (i.e. -4 instead of 4), and then run a stored procedure (or direct SQL) that inverts the negative numbers for the customer. Finished.

The point is that you need to break uniqueness during the updates.
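A minimal sketch of the two-pass idea, assuming a hypothetical unique Position column and an OrderId foreign key on OrderPositions (neither exists in the model above — they are only there to illustrate the trick):

using (var ctx = new OrdersDbContext())
{
    // Pass 1: write each row's NEW position, but negated, so the
    // intermediate state can never collide with an existing value.
    ctx.Database.ExecuteSqlCommand(
        "UPDATE OrderPositions SET Position = -@p0 WHERE Id = @p1",
        newPosition, positionId);

    // Pass 2: a single statement flips every negative back to its
    // final (positive) position without ever violating uniqueness.
    ctx.Database.ExecuteSqlCommand(
        "UPDATE OrderPositions SET Position = -Position " +
        "WHERE OrderId = @p0 AND Position < 0",
        orderId);
}

Here newPosition, positionId and orderId are placeholders for whatever values your update logic computes.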

TomTom

Alas, Entity Framework is pretty dumb when it comes to this type of operation; it gives you the framework, not a total solution.

As you have found, this is achievable via multiple calls to SaveChanges.

The reason it cannot be achieved in one call to SaveChanges is that there is no way of telling Entity Framework the order in which you want things to be actioned, i.e. it might delete a table record before it changes its foreign key, etc.

Now, the better way to perform multiple SaveChanges calls is within a transaction, as follows:

// using System.Transactions;
// using System.Data.Entity.Infrastructure;  // IObjectContextAdapter
// using System.Data.Entity.Core.Objects;    // SaveOptions (EF6)

TransactionScope scope = new TransactionScope(TransactionScopeOption.RequiresNew,
    new TransactionOptions { IsolationLevel = IsolationLevel.Serializable });

try
{
    using (scope)
    {
        using (OrdersDbContext ctx = new OrdersDbContext())
        {
            // DbContext has no SaveChanges(bool); the equivalent lives on
            // the underlying ObjectContext as SaveChanges(SaveOptions.None).
            var objCtx = ((IObjectContextAdapter)ctx).ObjectContext;
            objCtx.Connection.Open();

            //Do something with the DbContext - Update existing
            //Save changes without accepting them
            objCtx.SaveChanges(SaveOptions.None);

            //Do something else with the DbContext - Delete old
            //Save changes without accepting them
            objCtx.SaveChanges(SaveOptions.None);

            //Do something else with the DbContext - Insert new
            //Save changes without accepting them
            objCtx.SaveChanges(SaveOptions.None);

            //if we get here things are looking good.
            scope.Complete();
            objCtx.AcceptAllChanges();
        }
    }
}
catch (TransactionAbortedException ex)
{
    ErrorLogger.LogException(ex);
}
catch (ApplicationException ex)
{
    ErrorLogger.LogException(ex);
}

The major benefits here being that if one SaveChanges fails then the others are rolled back / not actioned.

While the call to SaveChanges(SaveOptions.None) (the old ObjectContext API spells it SaveChanges(false)) sends the necessary commands to the database, the context itself does not accept the changes, so you can do it again if necessary, or you can interrogate the ObjectStateManager if you want.

This means that if the transaction actually aborts you can compensate, by either re-trying or logging the state of the context's ObjectStateManager somewhere.
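A hedged sketch of what "interrogating the ObjectStateManager" might look like after an aborted scope (EF6 namespaces assumed; ctx is the DbContext from the snippet above):

// using System.Data.Entity;                 // EntityState
// using System.Data.Entity.Infrastructure;  // IObjectContextAdapter

// Because the changes were never accepted, every entry still carries its
// pre-save state and can be inspected, logged, or re-saved.
var stateManager = ((IObjectContextAdapter)ctx).ObjectContext.ObjectStateManager;

foreach (var entry in stateManager.GetObjectStateEntries(
    EntityState.Added | EntityState.Modified | EntityState.Deleted))
{
    Console.WriteLine("{0}: {1}", entry.State, entry.Entity);
}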

Paul Zahra
  • Not OP, but can you explain this more/link somewhere that explains what it is you are doing here, as an interested onlooker I don't actually understand what is happening or what the 2 contexts might relate to. – Michael Coxon Nov 10 '14 at 16:05
  • Ok... so is it all about delaying the commit? – Michael Coxon Nov 10 '14 at 16:06
  • It's more about specifying exactly when operations which could break primary / foreign key relationships etc occur... so you do things in an order which will not break the db rules. – Paul Zahra Nov 10 '14 at 16:09
  • If you alter multiple entities in the context and then call save changes... if some of the changes you made will break the DB then you are lucky if it works... Once you call SaveChanges EF will order the actual DB changes however it likes for optimisation purposes I believe... so you must do things one at a time, in the order that won't break the DB. – Paul Zahra Nov 10 '14 at 16:20

To extend my comment above...

The database and EF are not going to be able to handle that. You are detaching the entity from the store and creating a new object in the entity store which is not going to link up to the object in the database.

The issue you are having here is that the row still exists in the db; on the client side you are removing it, then telling the entity store to forget about it, then adding a new one to the entity store, then committing it. This makes EF do an insert instead of an update with the same Guid, causing a duplicate key error.

What you should do here is the following...

try
{
    using (var ctx = new OrdersDbContext())
    {
        var agg = ctx.Customers.First(x => x.LiveOrder.Id == parentId);

        // Positions, not LiveOrder itself, holds the OrderPosition rows.
        var pos = agg.LiveOrder.Positions.Single(o => o.Id == childId);
        pos.Price = 5;
        agg.LiveOrder.Name = "Updated order.";

        ctx.SaveChanges();
    }
}
catch (Exception ex)
{
    Console.WriteLine(ex);
}

Entity Framework tracks the objects client side and will detect changes to them. All you have to do is modify them like normal objects and call SaveChanges().
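For the question's full "insert new, update existing, delete old" requirement, one common hand-rolled pattern is to diff the tracked collection against the incoming one. A sketch, assuming the desired final positions arrive as a detached list named incoming (a name invented here):

using (var ctx = new OrdersDbContext())
{
    var agg = ctx.Customers.First(x => x.LiveOrder.Id == parentId);
    var order = agg.LiveOrder;

    // Walk a copy of the tracked collection so we can remove while iterating.
    foreach (var existing in order.Positions.ToList())
    {
        var match = incoming.FirstOrDefault(p => p.Id == existing.Id);
        if (match == null)
            ctx.OrderPositions.Remove(existing);   // delete old
        else
            existing.Price = match.Price;          // update existing in place
    }

    // Anything in the incoming list with an unseen Id is a new row.
    foreach (var added in incoming.Where(p => order.Positions.All(e => e.Id != p.Id)).ToList())
        order.Positions.Add(added);                // insert new

    ctx.SaveChanges();
}

This keeps everything in one SaveChanges because every change is expressed against tracked entities rather than detached ones.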

Michael Coxon