
In a unidirectional many-to-many relationship between Registration and Item (Registration has an ISet&lt;Item&gt; ItemsPurchased; Item has no reference back to registrations, since that direction is not a useful way to explore the object graph), when I look at the SQL being generated, I see:

INSERT INTO Registrations_Items (RegistrationId, ItemId) VALUES (@p0, @p1);@p0 = 1 [Type: Int32 (0)], @p1 = 1 [Type: Int32 (0)]
UPDATE Items SET Price = @p0, Name = @p1, [...], ListIndex = @p5, EventId = @p6 WHERE ItemId = @p7

The parameters passed to the update are correct, but nothing about the Item has changed, so the update is not needed.

Mapping is done by automapping, with this override in place for Registration and no overrides for Item. The DB schema looks completely correct. I removed all conventions and tested again, and the behavior persisted, so my mapping conventions are not causing this.

mapping.HasManyToMany(e => e.ItemsPurchased).AsSet().Cascade.All().Not.Inverse();
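For context, here is roughly how that override sits in a FluentNHibernate automapping setup (a sketch; the class name is my assumption, not from the original post):

```csharp
using FluentNHibernate.Automapping;
using FluentNHibernate.Automapping.Alterations;

// Sketch only: the override line above, hosted in the standard
// IAutoMappingOverride<T> hook that FluentNHibernate's automapper picks up.
public class RegistrationMappingOverride : IAutoMappingOverride<Registration>
{
    public void Override(AutoMapping<Registration> mapping)
    {
        // Registration owns the unidirectional many-to-many link table.
        mapping.HasManyToMany(e => e.ItemsPurchased)
               .AsSet()
               .Cascade.All()
               .Not.Inverse();
    }
}
```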

Why is NHibernate making this UPDATE call and what can I do to stop it? It's not really hurting anything but it suggests that I did something wrong, so I'd like to figure out what.

Edit: Per comment below, I created a unit test which creates an Event (an Item must belong to an Event), adds two Items to it, evicts the first from the session, flushes the session, then Gets the first Item back by its ID.
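In code, the cycle looks roughly like this (a sketch; the fixture plumbing, helper names, and second-item details are assumptions):

```csharp
// Sketch of the create-evict-flush-get cycle described above (NUnit-style).
[Test]
public void PhantomUpdateRepro()
{
    var evt = new Event { /* ... required Event fields ... */ };
    var item = new Item { Price = 100.42m, Name = "Item 1", ExternalID = "123" };
    evt.AddItem(item);                      // assumed helper that sets the Item's Event
    evt.AddItem(new Item { /* second item */ });

    session.Save(evt);                      // cascades to the Items
    session.Evict(item);                    // detach the first Item from the session
    session.Flush();

    var reloaded = session.Get<Item>(item.Id);  // re-fetch by ID; the unexpected
                                                // UPDATE appears on the next flush
}
```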

I notice something odd in the SELECT items line below (2nd from bottom)

INSERT INTO Events (blah blah blah...)
select @@IDENTITY
INSERT INTO Items (Price, Name, StartDate, EndDate, ExternalID, ListIndex, EventId) VALUES (@p0, @p1, @p2, @p3, @p4, @p5, @p6);@p0 = 100.42 [Type: Decimal (0)], @p1 = 'Item 1' [Type: String (0)], @p2 = NULL [Type: DateTime (0)], @p3 = NULL [Type: DateTime (0)], @p4 = '123' [Type: String (0)], @p5 = 0 [Type: Int32 (0)], @p6 = 1 [Type: Int32 (0)]
select @@IDENTITY
SELECT blah blah blah FROM Events event0_ WHERE event0_.EventId=@p0;@p0 = 1 [Type: Int32 (0)]
SELECT itemsforsa0_.EventId as EventId1_, itemsforsa0_.ItemId as ItemId1_, itemsforsa0_.ListIndex as ListIndex1_, itemsforsa0_.ItemId as ItemId3_0_, itemsforsa0_.Price as Price3_0_, itemsforsa0_.Name as Name3_0_, itemsforsa0_.StartDate as StartDate3_0_, itemsforsa0_.EndDate as EndDate3_0_, itemsforsa0_.ExternalID as ExternalID3_0_, itemsforsa0_.ListIndex as ListIndex3_0_, itemsforsa0_.EventId as EventId3_0_ FROM Items itemsforsa0_ WHERE itemsforsa0_.EventId=@p0;@p0 = 1 [Type: Int32 (0)]
UPDATE Items SET Price = @p0, Name = @p1, StartDate = @p2, EndDate = @p3, ExternalID = @p4, ListIndex = @p5, EventId = @p6 WHERE ItemId = @p7;@p0 = 100.42000 [Type: Decimal (0)], @p1 = 'Item 1' [Type: String (0)], @p2 = NULL [Type: DateTime (0)], @p3 = NULL [Type: DateTime (0)], @p4 = '123' [Type: String (0)], @p5 = 0 [Type: Int32 (0)], @p6 = 1 [Type: Int32 (0)], @p7 = 1 [Type: Int32 (0)]

The table is created correctly:

create table Items (
    ItemId INT IDENTITY NOT NULL,
    Price NUMERIC(19,5) not null,
    Name NVARCHAR(255) not null,
    StartDate DATETIME null,
    EndDate DATETIME null,
    ExternalID NVARCHAR(255) not null,
    ListIndex INT not null,
    EventId INT not null,
    primary key (ItemId)
)

The DateTimes are deliberately nullable because an item might not need to be date-specific (an example of something that is would be "early-bird registration").

Carl Bussema
  • To confirm a phantom update issue, `Get` that same item and check if an update is issued on flush/commit. Then you can rule out any many-to-many cascading issue. – dotjoe May 30 '12 at 20:51
  • Could this be related to having a decimal field? Google turns up some results for Phantom Updates and decimal, but I haven't gotten to the bottom of what they all say yet. – Carl Bussema May 31 '12 at 13:13

2 Answers


This is called a phantom update; it is usually related to the mapping of your objects.

This is the primary cause:

Imagine we have an object like this

public class Product
{
   public Guid Id { get; set; }
   public int ReorderLevel { get; set; }
   public decimal UnitPrice { get; set; }
}

and a map:

public class ProductMap : ClassMap<Product>
{
   public ProductMap()
   {
      Not.LazyLoad();
      Id(x => x.Id).GeneratedBy.GuidComb();
      Map(x => x.ReorderLevel);
      Map(x => x.UnitPrice).Not.Nullable();
   }
}

Note that the ReorderLevel column will accept nulls (it has no Not.Nullable() in the map, unlike UnitPrice).

If you save this entity without specifying a ReorderLevel, the column can end up null in the database. When you load it back, NHibernate records that null in its snapshot, but since ReorderLevel is an int, the property can only return 0. At flush time, 0 does not match the null snapshot, so the entity is marked dirty and an update is issued.

These kinds of errors are difficult to detect and track. I recommend using Nullable&lt;&gt; types whenever you really want a null in the database.
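Applied to the Product above, that means making the property itself nullable when the column is nullable, so a database null round-trips intact:

```csharp
public class Product
{
    public Guid Id { get; set; }

    // int? round-trips a database NULL as null instead of 0,
    // so reloading no longer makes the entity look dirty.
    public int? ReorderLevel { get; set; }

    public decimal UnitPrice { get; set; }
}
```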

The way I usually accomplish this is with a convention that automatically maps a value type as nullable if it is declared with Nullable&lt;&gt;; otherwise the column is marked as not-nullable.

Just to complement, this is how my convention looks:

    // Mapping-by-code hook: value types declared as Nullable<T> get nullable
    // columns; every other value type is forced to NOT NULL.
    mapper.BeforeMapProperty += (inspector, member, customizer) =>
    {
        var type = member.LocalMember.GetPropertyOrFieldType();

        if (type.IsValueType)
        {
            if (type.IsGenericType && type.GetGenericTypeDefinition() == typeof(Nullable<>))
            {
                customizer.Column(x => x.NotNullable(false));
            }
            else
            {
                customizer.Column(x => x.NotNullable(true));
            }
        }
    };
Jupaol
  • I already have a `ColumnNullConvention` which ensures that any field marked with `DataAnnotations.Required` is declared not null. – Carl Bussema May 31 '12 at 12:51
  • I have edited the original question to show the SQL for a simple create-evict-flush-get cycle and included the table mapping. – Carl Bussema May 31 '12 at 13:03
  • 1
    The problem is with the Price `field`, as you can see, it is inserted the value 100.42 but then it is updated with 100.42000 I think that's marking the entity as dirty. Do you have custom logic to format the `Price` field?, also could you update the post with the mapping of the `Price` field and its entity? – Jupaol May 31 '12 at 14:15
  • The Price field is a C# `decimal?` with a simple get/set and no custom logic. Views are responsible for formatting it for display. Its mapping is now mapping.Map(i => i.Price).Precision(6).Scale(2); although that hasn't changed anything. – Carl Bussema May 31 '12 at 14:29
  • Just one observation: if price is `decimal?`, why is it being marked as `NotNull` in the database? – Jupaol May 31 '12 at 14:34
  • I made some tests trying to replicate this behavior but I am not getting the phantom update; the only way to get an update is to explicitly change a field in my entity. I changed my mapping of a decimal property to `c.Precision(6);c.Scale(2);c.Column(z => z.SqlType("NUMERIC(19,5)"));` (I had to specify the SQL type because my conventions automatically map my decimal to DECIMAL in the database and you have NUMERIC instead). What version of NH and FNH are you using? – Jupaol May 31 '12 at 14:39
  • Typo, it's not actually `decimal?`, it is `decimal`, since an Item without a Price would be rather odd. This is NH 3.3.0.4000 and FNH 1.3.0.727. So I dug around... if I run the test `CanGenerateDatabaseSchema`, which just creates an `AutoPersistanceModelGenerator` and runs it, I get `Price DECIMAL(6, 2) not null`, but if I run only the unit test that's failing I get `Price NUMERIC(19, 0) not null`. I think I have an answer... – Carl Bussema May 31 '12 at 14:56

As noted in my comments on the other answer, I noticed that the difference between the CanGenerateDatabaseSchema unit test and the CanGetItem unit test was that one gave me DECIMAL(6, 2) and the other DECIMAL(19, 0).

I poked around more and realized that CanGenerateDatabaseSchema was using my "real" config (from the Web project) while the other test was using my "unit test" config. My unit tests were being run against Sql Server CE; when I changed them to use the same configuration as my real database (Sql Server 2005), the phantom update went away.

So if anyone else runs into unexpected phantom updates with decimals: check whether you're using Sql Server CE. Since the test actually passes (the earlier comment calling it "failing" was wrong; it isn't failing, just doing extra work), I guess I'll live with it, although why Sql CE is ignoring my precision/scale configuration is a good question, and possibly an NH or FNH bug.
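A sketch of the mismatch (connection strings and names are placeholders, not from the original project): the schema-generation test built its configuration against SQL Server 2005, while the other tests were silently built against SQL Server CE:

```csharp
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;

// "Real" configuration (what CanGenerateDatabaseSchema used): SQL Server 2005,
// which honored the mapped precision/scale and emitted DECIMAL(6, 2).
var realConfig = Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2005
        .ConnectionString(c => c.FromConnectionStringWithKey("Default")));

// Unit-test configuration: SQL Server CE, which in my case ignored the mapped
// precision/scale (emitting NUMERIC(19, 0)) and produced the phantom UPDATE.
var testConfig = Fluently.Configure()
    .Database(MsSqlCeConfiguration.Standard
        .ConnectionString("Data Source=tests.sdf"));
```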

Carl Bussema