Is decimal a primitive type? Originally I thought it was, because Microsoft defines it as a primitive over here, but then I used the Type.IsPrimitive property and it returned false for decimal (Type.IsPrimitive). Did they just mess up, or am I really missing something here?

This question's accepted answer is circular: "To me, it seems as though decimal is a type that must exist for a language/runtime wanting to be CLS/CLI-compliant (and is hence termed "primitive" because it is a base type with keyword support), but the actual implementation does not require it to be truly "primitive" (as in the CLR doesn't think it is a primitive data type)."

I understand that keyword support alone doesn't make something a primitive; string has a keyword, yet it is a reference type and not primitive. Still, what definition of "primitive" does the CLR use when it says decimal is not one? It doesn't make sense that the only reason something is a primitive is that the CLR accepts it as such. By that reasoning, if the CLR started accepting decimal as a primitive, would it then be defined as a primitive?
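Here is a minimal sketch of what I am seeing (assuming a plain console project), showing the reflection result alongside the language-level treatment of decimal:

    using System;

    class Program
    {
        static void Main()
        {
            // The CLR's notion of "primitive" is a fixed set of simple value types.
            Console.WriteLine(typeof(int).IsPrimitive);     // True
            Console.WriteLine(typeof(double).IsPrimitive);  // True
            Console.WriteLine(typeof(decimal).IsPrimitive); // False
            Console.WriteLine(typeof(string).IsPrimitive);  // False

            // Yet the C# compiler still treats decimal specially: it has a keyword,
            // a literal suffix, and built-in implicit conversions from the integral types.
            decimal d = 42;   // implicit int -> decimal conversion, no cast needed
            Console.WriteLine(d);
        }
    }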

  • OK, suppose for the sake of argument that Microsoft gets their act together and clearly and unambiguously says "decimal is not primitive". What will you now do differently in your life? Suppose they say "decimal is primitive". Again, what will you do differently? I am trying to understand why you asked this question. – Eric Lippert Jan 10 '16 at 07:00
  • A language designer's choice of a primitive type does not have to match the implementer's choice. Pretty important that decimal is primitive in C# and VB.NET, the compiler needs to be aware of implicit conversions. Not primitive in the CLR, above all the vast majority of processors do not support them. Nor does the CLI spec dictate what it looks like internally. Actual implementation came from COM and predated .NET by many years. – Hans Passant Jan 10 '16 at 07:43

0 Answers