Floating-point precision issue when decoding a JSON value

When working with decimals, encoding a value such as 68.32 does not result in the same value when decoding it afterwards.

I'm using the suggestion from "Decimal has no rounded() - #12 by LinusU" to round the value before encoding it.

Might this be a bug in the JSONEncoder / JSONDecoder? Is there any reliable workaround?

Here is some example code:

import Foundation

let decimalValue: Decimal = 68.32 // 68.31999999999998976
let roundedDecimalValue = decimalValue.rounded(2, .plain) // 68.32

do {
    let data = try JSONEncoder().encode(roundedDecimalValue)
    let decodedDecimal = try JSONDecoder().decode(Decimal.self, from: data) // 68.31999999999999
    print(decodedDecimal)
} catch {
    print(error) // don't silently swallow encode/decode failures
}

extension Decimal {
    mutating func round(_ scale: Int, _ roundingMode: NSDecimalNumber.RoundingMode) {
        var localCopy = self
        NSDecimalRound(&self, &localCopy, scale, roundingMode)
    }

    func rounded(_ scale: Int, _ roundingMode: NSDecimalNumber.RoundingMode) -> Decimal {
        var result = Decimal()
        var localCopy = self // 68.31999999999998976
        NSDecimalRound(&result, &localCopy, scale, roundingMode)
        return result // 68.32
    }
}

I suspect this is because Foundation.Decimal is actually sort of terrible: it interprets floating-point literals as Swift.Double during initialization, meaning it can’t be initialized exactly from literals whose values aren’t exactly representable as a Double.
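For anyone who wants to see that literal behavior directly, a minimal sketch (the string initializer is the reliable way to get an exact value):

import Foundation

let fromLiteral: Decimal = 68.32           // routed through Double: 68.31999999999998976
let fromString = Decimal(string: "68.32")! // exact: 68.32
print(fromLiteral == fromString)           // false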

Personally, I wish they didn’t declare that conformance to ExpressibleByFloatLiteral at all: it’s quite misleading. It’s not the only issue with Foundation.Decimal, of course: once Swift Numerics adds a replacement, I would like to see everyone switch to that.


This is a long-standing issue, SR-7054.


Note that this is less a problem with Decimal and more a problem with floating-point literals (and more precisely the lack of a decimal literal). Which isn't to say that it isn't a problem, but rather that we shouldn't blame Foundation for it. =)


Of course: I’m only blaming Foundation for making Decimal conform to ExpressibleByFloatLiteral in the first place, which obscures the issue.

Foundation in general doesn’t have problems that could be solved easily. It certainly has problems, but they all stem from predating language features or from needing to support legacy behavior.

The actual underlying issue here has nothing to do with Decimal at all. It's because JSONDecoder uses JSONSerialization under the hood to decode JSON, and JSONSerialization does so eagerly. At decode time, JSONSerialization has no idea what type might be "desired" on the receiving end, and prefers to decode Double values where possible. When JSONDecoder receives the value to decode, it's already a Double (at which point the "damage" has been done), so even if Decimal were perfect, it's already too late.
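A minimal sketch of that failure mode, using the kind of payload discussed later in this thread (the printed digits match the behavior reported above):

import Foundation

struct Payment: Decodable {
    let amount: Decimal
}

let json = Data("{\"amount\": 68.32}".utf8)

do {
    let payment = try JSONDecoder().decode(Payment.self, from: json)
    print(payment.amount)                              // 68.31999999999999
    print(payment.amount == Decimal(string: "68.32")!) // false
} catch {
    print(error)
}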

The issue that @lukasa linked above has all of the [long-standing] specific details.


There are solutions to this (e.g., JSONSerialization could theoretically vend an NSNumber subclass which contains both the parsed numeric value and the underlying string data, which JSONDecoder could use based on the actual type being requested), but it'd be up to the Foundation team to decide on a direction and implement it.
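Just to illustrate the shape of that idea (this is not Foundation API, purely a sketch; the real proposal would be an NSNumber subclass, but a plain struct shows the idea just as well):

import Foundation

struct NumberToken {
    let double: Double    // what eager parsing produces today
    let rawText: String   // the untouched token from the JSON, e.g. "68.32"

    // A decoder asked for Decimal could parse the raw text instead
    // of converting the (already lossy) Double.
    var decimal: Decimal? { Decimal(string: rawText) }
}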


I was addressing this part:

You’re right, though, that wasn’t technically what was asked. The issue is extremely similar, in my defense.

I imagine other Decoder implementations might do better. You don’t have to use Foundation’s, after all. It’s not strictly in violation of the JSON spec, either.

What is the best, i.e. most correct, workaround for the JSONSerialization problem then?

When we are working with real money and get a JSON response back, e.g.

{
  "amount": 68.32
}

and would need to authorize that exact amount, let's say with Apple Pay, how would one decode that message so that there is no floating-point precision error whatsoever? Changing the response from a number to a string is not an option.

Writing a custom numeric encoder/decoder?

If you have to keep it a number, is transmitting the number of cents instead of fractional dollars/euro/whatever an option for you?
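If that were possible, the decoding side becomes trivial, since integers survive JSON round-trips exactly. A sketch, using a hypothetical amountCents field:

import Foundation

// Hypothetical payload: {"amountCents": 6832}
struct Money: Decodable {
    let amountCents: Int
}

extension Money {
    var amount: Decimal {
        Decimal(amountCents) / 100 // exactly 68.32; this Decimal division is lossless
    }
}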


And if not, I suspect that continuing to round to scale in this way is likely a reasonable solution in the meantime. @scanon — numerically, do you know if it's possible to lose precision when reading this way (within the numerical range of dollar amounts) such that rounding in this way would not produce the same result as the string representation of the number? (My gut says "no", but would be interested if this is the case.)
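For intuition in the meantime (not an authoritative answer): a Double carries roughly 15–17 significant decimal digits, so for realistic dollar amounts the error introduced by the Double round-trip is far below half a cent, and rounding to scale 2 should recover the original. A brute-force sanity check over one-cent steps, reusing the rounded(_:_:) helper from the top of the thread:

import Foundation

// Assumes the Decimal.rounded(_:_:) extension defined earlier.
var mismatches = 0
for cents in 0...100_000 { // $0.00 through $1,000.00
    let exact = Decimal(cents) / 100
    let viaDouble = Decimal(Double(cents) / 100).rounded(2, .plain)
    if viaDouble != exact { mismatches += 1 }
}
print(mismatches) // 0 over this range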

Unfortunately not. We're just one of many API consumers, so it wouldn't be changed only because the iOS/Swift implementation is running into a problem.

I’m sure you could find an existing one that does the trick, though I can’t point to any in particular.

It is worth noting that KeyedDecodingContainer is not making this any easier.


FYI: we've ended up implementing a solution based on "decodable - Swift: Decode imprecise decimal correctly - Stack Overflow".
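For context, the gist of that family of workarounds (a sketch, not necessarily the exact linked code): decode the lossy Double, then rebuild the Decimal from the Double's description. Swift prints a Double using the shortest decimal string that round-trips, so "68.32" comes back out even though the Double itself isn't exactly 68.32.

import Foundation

struct LossyDecimal: Decodable {
    let value: Decimal

    init(from decoder: Decoder) throws {
        let double = try decoder.singleValueContainer().decode(Double.self)
        // "\(double)" yields the shortest round-tripping representation,
        // e.g. "68.32" rather than "68.31999999999999".
        guard let decimal = Decimal(string: "\(double)") else {
            throw DecodingError.dataCorrupted(DecodingError.Context(
                codingPath: decoder.codingPath,
                debugDescription: "Cannot represent \(double) as Decimal"))
        }
        value = decimal
    }
}

Note this only recovers what the sender wrote if the original JSON text was itself the shortest representation of the parsed Double; it can't resurrect digits the Double never carried.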

In general, I have not had good experiences with the default JSONDecoder, as I have detailed here. My recommendation continues to be to implement it directly and control your own parsing.