[Solved] Newtonsoft.Json Serialization of decimals does not respect precision


C# decimal value serialized to JSON and de-serialized back to decimal gives a number with different precision.


Decimals in .NET are tricky: besides the number itself, they store the number of digits necessary to represent it. For example, the numbers 15 and 15.0 stored in a decimal variable are represented differently in memory, though they compare as equal. When we serialize and de-serialize such numbers, it is important to preserve this information.
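You can observe this difference directly: decimal.GetBits exposes the scale (the count of digits after the decimal point) that each value carries. A minimal sketch:

```csharp
using System;

class ScaleDemo
{
    static void Main()
    {
        decimal a = 15m;
        decimal b = 15.0m;

        Console.WriteLine(a == b);       // True: the values compare equal
        Console.WriteLine(a.ToString()); // "15"   (scale 0)
        Console.WriteLine(b.ToString()); // "15.0" (scale 1)

        // The scale lives in bits 16-23 of the fourth element returned by GetBits.
        Console.WriteLine((decimal.GetBits(a)[3] >> 16) & 0xFF); // 0
        Console.WriteLine((decimal.GetBits(b)[3] >> 16) & 0xFF); // 1
    }
}
```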

Steps to reproduce

using Newtonsoft.Json;
using System;

namespace JsonDecimalIssue
{
    class Program
    {
        static void Main(string[] args)
        {
            decimal before = 15;
            string serialized = JsonConvert.SerializeObject(before); // produces "15.0" <- incorrect
            decimal after = JsonConvert.DeserializeObject<decimal>(serialized);

            Console.WriteLine(before); // Writes "15"
            Console.WriteLine(after);  // Writes "15.0"
        }
    }
}

Possible solution

The issue can be solved by keeping the necessary number of decimal digits in JSON representation of the number, e.g. serialize decimal 15 as integer "15", and decimal 15.0 as "15.0". This is exactly how Decimal.ToString() works. Then the number of digits can be respected when de-serializing back to decimal.
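Decimal.ToString and decimal.Parse already round-trip the scale this way, which is the behavior proposed above for the serializer. A small sketch (using the invariant culture to keep the output locale-independent):

```csharp
using System;
using System.Globalization;

class RoundTripDemo
{
    static void Main()
    {
        decimal a = 15m;
        decimal b = 15.0m;

        // ToString emits exactly the digits the value carries...
        string sa = a.ToString(CultureInfo.InvariantCulture); // "15"
        string sb = b.ToString(CultureInfo.InvariantCulture); // "15.0"

        // ...and Parse restores the scale from the digit count.
        decimal a2 = decimal.Parse(sa, CultureInfo.InvariantCulture);
        decimal b2 = decimal.Parse(sb, CultureInfo.InvariantCulture);

        Console.WriteLine(a2); // "15"
        Console.WriteLine(b2); // "15.0"
    }
}
```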

13 Answers

✔️ Accepted Answer

I agree that this is unexpected behaviour at the very least, and imho it is also a bug. For 15, the precision is 2 and the scale is 0. For 15.0, the precision is 3 and the scale is 1. They're two different things.

I wholly understand that this could be a major breaking change, but can you please reconsider it?

Other Answers:

  1. I would say this behavior is not expected unless you are aware of it. This is special processing of the decimal data type, which an average user won't expect.

  2. When the expected behavior causes a problem, it is still a problem. The question is: can this expected behavior be changed?

You see, if a value becomes something else after we serialize and de-serialize it, that is generally not good. Don't you agree? And I had a real problem with this, so it is not just my perfectionism.

Of course, the change is not desirable if it is potentially breaking. I do not know how much client code may rely on having .0 in serialized decimals... JSON follows JavaScript data types, and in JavaScript it is all the same data type: Number. When we de-serialize, we are guided by C# type definitions, so we do not need an extra hint (like having .0) to distinguish integer numbers from floating point and decimals. So I do not see any negative impact of the change I proposed.

Can you please just think one more time about it?

Thank you for the hint about custom converter, I will try that.
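For anyone else landing here, a custom converter along those lines might look like this. This is a minimal sketch, not the library's official answer; the class name is my own, and it only overrides writing (CanRead is false, so reading falls back to Json.NET's default decimal handling, which already preserves the digits present in the JSON text):

```csharp
using System;
using System.Globalization;
using Newtonsoft.Json;

// Writes decimals as raw JSON via Decimal.ToString, so the scale
// (e.g. 15 vs 15.0) is preserved in the serialized output.
class DecimalScaleConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
        => objectType == typeof(decimal) || objectType == typeof(decimal?);

    // Only writing needs fixing; let Json.NET read decimals as usual.
    public override bool CanRead => false;

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        // Invariant culture guarantees '.' as the separator, keeping the output valid JSON.
        writer.WriteRawValue(((decimal)value).ToString(CultureInfo.InvariantCulture));
    }

    public override object ReadJson(JsonReader reader, Type objectType,
        object existingValue, JsonSerializer serializer)
    {
        throw new NotSupportedException(); // never called, since CanRead is false
    }
}

class Demo
{
    static void Main()
    {
        var converter = new DecimalScaleConverter();
        Console.WriteLine(JsonConvert.SerializeObject(15m, converter));   // 15
        Console.WriteLine(JsonConvert.SerializeObject(15.0m, converter)); // 15.0
    }
}
```

Note that WriteRawValue bypasses the writer's own formatting, which is exactly what lets the converter emit the digits Decimal.ToString produces instead of the serializer's default "15.0".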
