I'm seeing some very odd behaviour that I can't work out, and I'm hoping someone can help. Originally I saw this in a larger application with lots of values, but I've stripped it down to a simple example.
Test code:
[Test]
public void DecimalIssue()
{
    decimal decimalValueToSet = 0.00003110M;
    var modelToAdd = new TestModel { Value = decimalValueToSet };

    TestModel addedEntity = _db.Add(modelToAdd);
    int idToUse = addedEntity.Id;
    decimal? addedValue1 = addedEntity.Value;
    //int idToUse = 1;

    TestModel entity = _db.GetById(idToUse);
    decimal? value1 = entity.Value; // Sometimes value1 is 0.00000000 and sometimes it's 0.00003110
    decimal value2 = entity.Value.Value;
    decimal? value3 = Convert.ToDecimal(entity.Value);
    double value4 = Convert.ToDouble(entity.Value);
    decimal? value5 = decimalValueToSet;

    entity.Value = decimalValueToSet;
    TestModel updatedEntity = _db.Update(entity);
    decimal? updatedValue = updatedEntity.Value;

    TestModel newEntity = _db.GetById(idToUse);
    decimal? newValue = newEntity.Value;
}
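For context, _db in the test is a thin repository-style wrapper around the EF6 context. The exact implementation isn't important, but it behaves roughly like the sketch below (the class name is a placeholder, and TestContext is the assumed DbContext, sketched after the model further down):

using System.Data.Entity;

// Rough sketch of the _db wrapper; the method shapes match the calls in the test above.
// The class name is a placeholder and TestContext is the assumed EF6 DbContext.
public class TestDb
{
    private readonly TestContext _context = new TestContext();

    public TestModel Add(TestModel model)
    {
        _context.Tests.Add(model);
        _context.SaveChanges(); // Id is generated by the database on insert
        return model;
    }

    public TestModel GetById(int id)
    {
        return _context.Tests.Find(id);
    }

    public TestModel Update(TestModel model)
    {
        _context.Entry(model).State = EntityState.Modified;
        _context.SaveChanges();
        return model;
    }
}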
Steps to reproduce the issue:
- Start by adding the table through the SQL Database Project and a VS schema comparison.
- Run the test to add the value to the DB. Everything looks correct when debugging the test.
- Check the table in SSMS. The value shows as 0.00000000.
- Run the same test again, but comment out the add section and uncomment idToUse = 1.
- Now I see odd behaviour in the test: entity.Value is 0.00003110, but value1 to value4 are all 0.00000000, while updatedValue and newValue are correct.
- The DB value is still 0.00000000 after the test finishes.
- Directly update the DB value in SSMS to 0.00003110. Run the test again and now everything is correct in the test, and the DB value in SSMS is correct afterwards.
- Adding a new row behaves the same way the first one did.
- Delete the table, create it again, and everything is back to square one.
Environment details:
- SQL Database Project - DB Schema updated through a VS schema comparison
- Azure SQL Database
- .NET 5
- EF6
- VS 2019 - 16.9.4
- SSMS - 18.9.1
Not sure if this will help, but most values I add show as 0.00000000 in SSMS, while some values show differently. If, for example, I use 0.50516090M instead, it shows in SSMS as 0.50000000.
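In case it helps, the SQL that EF6 generates can be captured with the built-in Database.Log hook, which should show the type and, for decimal parameters, the precision and scale EF assigns. A minimal sketch, assuming the context class is called TestContext:

[Test]
public void LogGeneratedSql()
{
    using (var context = new TestContext()) // TestContext is the assumed EF6 DbContext
    {
        // Write every command EF sends to the console, including the
        // details of each parameter.
        context.Database.Log = s => Console.Write(s);

        context.Tests.Add(new TestModel { Value = 0.00003110M });
        context.SaveChanges(); // the INSERT and its parameter show up in the test output
    }
}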
Model:
[Table("Test")]
public class TestModel
{
[Key]
public int Id { get; set; }
public decimal? Value { get; set; }
}
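For completeness, the context itself is nothing special. It looks roughly like this (class and connection-string names are placeholders); note that there is currently no fluent precision mapping for Value (i.e. no HasPrecision call), so EF6 uses its default decimal mapping:

using System.Data.Entity;

// Assumed shape of the EF6 context; class and connection-string names are placeholders.
public class TestContext : DbContext
{
    public TestContext() : base("name=TestDb") { }

    public DbSet<TestModel> Tests { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // There is no explicit precision/scale mapping for Value here, i.e. nothing like:
        // modelBuilder.Entity<TestModel>().Property(t => t.Value).HasPrecision(12, 8);
        base.OnModelCreating(modelBuilder);
    }
}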
DB schema:
CREATE TABLE [dbo].[Test] (
    [Id]    INT             IDENTITY (1, 1) NOT NULL,
    [Value] DECIMAL (12, 8) NULL,
    CONSTRAINT [PK_Test] PRIMARY KEY CLUSTERED ([Id] ASC)
);
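One extra data point: reading the row back with plain ADO.NET, bypassing EF completely, to see what is actually stored. A minimal sketch (the connection string is a placeholder, and System.Data.SqlClient could equally be Microsoft.Data.SqlClient):

using System;
using System.Data.SqlClient;

[Test]
public void ReadValueDirectly()
{
    // Placeholder connection string; this bypasses EF entirely and reads the raw stored value.
    using (var connection = new SqlConnection("<connection string>"))
    using (var command = new SqlCommand("SELECT [Value] FROM [dbo].[Test] WHERE [Id] = @id", connection))
    {
        command.Parameters.AddWithValue("@id", 1);
        connection.Open();

        object result = command.ExecuteScalar();
        decimal? stored = result == null || result == DBNull.Value
            ? (decimal?)null
            : (decimal)result;
        Console.WriteLine(stored); // shows either 0.00000000 or 0.00003110
    }
}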