I was told that decimal is implemented as a user-defined type, while other C# types like int have specific opcodes devoted to them. What's the reasoning behind this?
2 Answers
decimal isn't alone here; DateTime, TimeSpan, Guid, etc. are also custom types. I guess the main reason is that they don't map to CPU primitives. float (IEEE 754), int, etc. are pretty ubiquitous here, but decimal is bespoke to .NET.
This only really causes a problem if you want to talk to the operators directly via reflection (since they don't exist for int, etc.). I can't think of any other scenario where you'd notice the difference; see the sketch below.
(Actually, there are still structs to represent the other types - they are just lacking most of what you might expect to be in them, such as operators.)
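A minimal sketch of that reflection point (a small console program, names illustrative only): decimal declares its operators as static methods on the struct, so reflection can find them, whereas int addition compiles straight to an IL opcode, so Int32 has no operator method to discover.

```csharp
using System;
using System.Reflection;

class Program
{
    static void Main()
    {
        // decimal defines op_Addition as a static method on the struct,
        // so reflection can see it.
        MethodInfo decimalAdd = typeof(decimal).GetMethod(
            "op_Addition", new[] { typeof(decimal), typeof(decimal) });
        Console.WriteLine(decimalAdd != null);   // True

        // int addition compiles to the IL 'add' opcode instead,
        // so Int32 exposes no op_Addition method at all.
        MethodInfo intAdd = typeof(int).GetMethod(
            "op_Addition", new[] { typeof(int), typeof(int) });
        Console.WriteLine(intAdd == null);       // True
    }
}
```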

-
I think the point is that decimal is the only type which gets its own keyword in C# but isn't treated specially by the CLR. – Jon Skeet Dec 04 '08 at 06:27
"What's the reasoning behind this?"
Decimal math is handled in software rather than hardware. Many current processors don't support native decimal (financial decimal, as opposed to binary float) math, though that is changing with the adoption of IEEE 754R.
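A small sketch of why that software-backed base-10 representation matters in practice (a hypothetical console program, purely illustrative): binary float cannot represent 0.1 exactly, while decimal can.

```csharp
using System;

class Program
{
    static void Main()
    {
        // Binary float cannot represent 0.1 exactly, so rounding
        // error accumulates across repeated additions.
        double d = 0.0;
        for (int i = 0; i < 10; i++) d += 0.1;
        Console.WriteLine(d == 1.0);   // False

        // decimal stores a base-10 significand, so 0.1m is exact -
        // at the cost of the arithmetic being done in software.
        decimal m = 0.0m;
        for (int i = 0; i < 10; i++) m += 0.1m;
        Console.WriteLine(m == 1.0m);  // True
    }
}
```

This exactness is why decimal is the usual choice for financial calculations, despite being slower than the hardware-backed float and double.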