
I'm getting an annoying inconsistency when writing decimals to JSON using Json.NET: sometimes the value is serialized to 1 decimal place, other times to 2.

Obviously I'm aware of solutions for formatting decimals as strings with a set number of decimal places, such as this, but you don't get that control over Json.NET's output without writing a custom serializer, I guess.

I am also aware of Math.Round for enforcing a maximum number of decimal places; this question is about enforcing a minimum number of decimal places.
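
For example, Math.Round caps the number of decimal places but never pads with zeros (illustrative values):

var a = Math.Round(1.236m, 2); // 1.24 - rounded to a maximum of 2 places
var b = Math.Round(1.0m, 2);   // 1.0  - returned unchanged, still only 1 place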

The first two tests show what is happening: the decimal keeps the number of decimal places it picked up from its declaration or calculation.

I found I can add and then subtract a small fraction, as the next two tests show, but is there a cleaner way?

[TestFixture]
public sealed class DecimalPlaces
{
    public class JsonType
    {
        public decimal Value { get; set; }
    }

    [Test]
    public void TwoDp()
    {
        var obj = new JsonType { Value = 1.00m };
        Assert.AreEqual("{\"Value\":1.00}", JsonConvert.SerializeObject(obj));
    }

    [Test]
    public void OneDp()
    {
        var obj = new JsonType { Value = 1.0m };
        Assert.AreEqual("{\"Value\":1.0}", JsonConvert.SerializeObject(obj));
    }

    private decimal ForceMinimumDp(decimal p, int minDecimalPlaces)
    {
        // Adding and then subtracting e.g. 0.01 leaves the value unchanged
        // but bumps its scale up to at least minDecimalPlaces.
        decimal smallFrac = 1m / ((decimal)Math.Pow(10, minDecimalPlaces));
        return p + smallFrac - smallFrac;
    }

    [Test]
    public void ForceMinimumTwoDp()
    {
        var obj = new JsonType { Value = ForceMinimumDp(1.0m, 2) };
        Assert.AreEqual("{\"Value\":1.00}", JsonConvert.SerializeObject(obj));
    }

    [Test]
    public void ForceMinimumThreeDp()
    {
        var obj = new JsonType { Value = ForceMinimumDp(1.0m, 3) };
        Assert.AreEqual("{\"Value\":1.000}", JsonConvert.SerializeObject(obj));
    }
}
  • `1`, `1.0` and `1.00` are generally treated as equivalent in JSON. Is this just to make for a prettier display, or does your JSON parser really treat them as different? Also, adding and subtracting `0.01` forces *at least* two decimals, but it won't round numbers with more decimals. Is that what you're after? Your question suggests to me you do want some sort of rounding, by asking to force a certain number of decimal places. – Jan 02 '16 at 18:03
  • @hvd No, it's just I'm comparing json outputs manually and because decimal place accuracy is switching on me it's annoying. So yeah it's for prettiness reasons. – weston Jan 02 '16 at 18:07
  • @hvd "but it won't round numbers with more decimals" No I know how to do that. Consider these numbers already rounded to no more decimal places than I want. – weston Jan 02 '16 at 18:07
  • @hvd I have changed title to "How can I force a minimum number of decimal places..." – weston Jan 02 '16 at 18:10
  • @hvd well I'm expecting other changes, so deep equals will be false. When I do the file compare, I don't want to be hit with all these decimal places changes, just want to check the expected changes are OK. – weston Jan 02 '16 at 18:19
  • I deleted my previous comment after I noticed that Json.NET *does* appear to treat different number types as unequal. As for your last comment, you don't just want to know whether two JSON strings represent the same values (you already know they won't), you want to find what the differences are? Okay, then yeah, even if the method I hinted at would work as I expected, it wouldn't cover what you want. –  Jan 02 '16 at 18:22
  • See this answer: https://stackoverflow.com/questions/46684557/preserve-remove-trailing-zeros-in-newtonsoft-json/62594755#62594755 – Joel Wiklund Jun 26 '20 at 12:30

2 Answers


You can do it with a custom JSON converter:

class DecimalJsonConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return objectType == typeof(decimal);
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue,
        JsonSerializer serializer)
    {
        // Write-only converter: deserialising with it registered is not supported.
        throw new NotImplementedException();
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        // Always emit exactly two decimal places, e.g. 1 -> 1.00 and 1.5 -> 1.50.
        writer.WriteRawValue(((decimal)value).ToString("F2", CultureInfo.InvariantCulture));
    }
}

This is a very basic converter. You may need to extend it to support other floating-point types, or perhaps even integer types too.
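
For instance, a possible extension that also covers double might look like this (a sketch only; the class name is mine, not part of the original converter):

class FloatingPointJsonConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return objectType == typeof(decimal) || objectType == typeof(double);
    }

    // Reading is not handled here; Json.NET falls back to its default behaviour.
    public override bool CanRead
    {
        get { return false; }
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue,
        JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        // Format either numeric type with exactly two decimal places.
        var formatted = value is decimal d
            ? d.ToString("F2", CultureInfo.InvariantCulture)
            : ((double)value).ToString("F2", CultureInfo.InvariantCulture);
        writer.WriteRawValue(formatted);
    }
}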

Now instantiate your serialiser and pass it your custom converter, like so:

var serializer = new JsonSerializer();
serializer.Converters.Add(new DecimalJsonConverter());
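
If you are serialising via the static JsonConvert helpers, as in the tests in the question, the converter can also be passed directly; a brief sketch:

var obj = new JsonType { Value = 1.0m };
var json = JsonConvert.SerializeObject(obj, new DecimalJsonConverter());
// json == "{\"Value\":1.00}"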

I know this is an old question, but I had a similar problem while automating the verification of JSON API responses, where I needed to force a minimum of 3 decimal places. I thought I'd leave my somewhat generic solution here, as it may save someone time if they hit a similar issue.

The actual JSON returned values zero-padded to 3 decimal places if there were only 1 or 2 digits after the decimal point.

My expected verification data then also needed to be zero padded, e.g.

10.3   pad to 10.300
10.34  pad to 10.340
10.345 leave as is, since it will already match the actual

I built up my expected values in the following manner, which solved the problem for me:

public static decimal GetRate(
    string fromCurrency,
    string toCurrency,
    decimal rawRate,
    decimal margin)
{
    if (fromCurrency == toCurrency) return 1m;

    var _rate = Math.Round(rawRate * (1 + (margin / 100)), 7);

    // Count the digits after the decimal point, guarding against a whole-number
    // rate (which would otherwise make the Split below throw).
    var _rateText = _rate.ToString(CultureInfo.InvariantCulture);
    var _numberOfPlacesAfterDecimalPlace =
        _rateText.Contains(".") ? _rateText.Split('.')[1].Length : 0;

    // NOTE: The software API response stores the value to 3 decimal places, so cater
    // for that here for currency pairs where the resulting rate has fewer than 3
    // decimal places. This zero pads the result after the decimal point to 3 places.
    if (_numberOfPlacesAfterDecimalPlace >= 3) { return _rate; }
    return decimal.Parse(_rate.ToString("F3", CultureInfo.InvariantCulture),
        CultureInfo.InvariantCulture);
}
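
For illustration, here is how GetRate behaves with a couple of hypothetical inputs (the currency codes, rates and zero margin are just illustrative values):

var padded    = GetRate("USD", "EUR", 10.3m, 0m);   // 10.300 - padded to 3 places
var untouched = GetRate("USD", "EUR", 10.345m, 0m); // 10.345 - already 3 places, left as is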

My solution avoided the need for me to write a custom JSON converter for this particular issue.
