    public class Invoice
    {
        public decimal Amount { get; set; }
    }

I know that M is the suffix for decimal literals.

    public class Invoices
    {
        public void IdentityTest()
        {
            Invoice firstInvoice = new Invoice();
            firstInvoice.Amount = 0.0M;
        }
    }

What is the point of declaring class properties as decimal when you still have to use the M suffix on every value that you expect to be decimal?

Rene
  • The same reason you'd declare a string field, then store a string in it rather than, say, an integer? Implicit casting saves the day here and (in this case), I'd expect the compiler to fix things, but the principle still holds. – spender Apr 09 '13 at 12:31
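spender's point can be sketched as follows (a minimal example reusing the Invoice class from the question; the int assignment compiles because C# has a built-in implicit int-to-decimal conversion, while double-to-decimal has none):

```csharp
using System;

public class Invoice
{
    public decimal Amount { get; set; }
}

public class Program
{
    public static void Main()
    {
        var invoice = new Invoice();
        invoice.Amount = 5;       // fine: int converts to decimal implicitly
        // invoice.Amount = 5.5;  // error CS0664: double does not convert to decimal implicitly
        invoice.Amount = 5.5M;    // fine: the literal is already a decimal
        Console.WriteLine(invoice.Amount);
    }
}
```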

3 Answers


You need to tell the compiler the type of the literal; it is not safe to infer it from the assignment target, since you may actually have wanted the literal's default type (double, in this case).

Assignment without a suffix will work if there is an implicit conversion between the two types; if there isn't, you need to specify a cast or conversion.
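A short sketch of the three cases this answer describes (the helper names below are illustrative, not part of the answer):

```csharp
using System;

public static class DecimalLiterals
{
    public static decimal FromSuffix() => 0.5M;        // the M suffix types the literal as decimal
    public static decimal FromCast() => (decimal)0.5;  // explicit conversion from a double literal
    public static decimal FromInt() => 5;              // int has an implicit conversion to decimal

    public static void Main()
    {
        // decimal d = 0.5; would not compile: no implicit double-to-decimal conversion
        Console.WriteLine(FromSuffix() + FromCast() + FromInt());
    }
}
```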

Oded

If you don't use the suffix M, the literal will be treated as a double. Assuming you wanted a decimal, this results in a compiler error, because there is no implicit conversion from double to decimal.

DGibbs

From MSDN:

Without the suffix m, the number is treated as a double, thus generating a compiler error.

http://msdn.microsoft.com/en-us/library/364x0z75(v=vs.80).aspx
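A minimal sketch of what the quote describes (the variable name is illustrative):

```csharp
using System;

public class Program
{
    public static void Main()
    {
        // decimal wrong = 3.5;  // error CS0664, exactly as the MSDN quote describes
        decimal right = 3.5M;    // the m/M suffix makes the literal a decimal
        Console.WriteLine(right);
    }
}
```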

Joey Gennari