JSONEncoder / Encodable: Floating point rounding error

I am finding a floating point rounding error when using JSONEncoder to encode Doubles:

do {
    let encoder = JSONEncoder()
    let data = try encoder.encode(4.18)
    let encodedString = String(data: data, encoding: .utf8)!
    XCTAssertEqual(encodedString, "4.18")
} catch {
    XCTFail(String(describing: error))
}

This produces the following failure:

XCTAssertEqual failed: ("4.1799999999999997") is not equal to ("4.18")

Does anyone know a good workaround for reliably encoding Doubles to JSON?

This isn’t an error, but a consequence of how floating point numbers are encoded and stored in memory. Not every decimal value is exactly representable. If you need arbitrary-precision numbers and lossless conversion, you need to use some other representation.

Perhaps Decimal works for you if you need numbers in base ten? However, note that you’ll have to encode them as strings in your JSON for the round trip to be lossless.
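For example, a minimal sketch of that string-based approach, with a made-up Price wrapper (the name and shape are just illustrative):

import Foundation

// Stores a Decimal but encodes it as a JSON string, so the base-ten
// value survives the round trip exactly.
struct Price: Codable {
    var amount: Decimal

    init(amount: Decimal) {
        self.amount = amount
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.singleValueContainer()
        let text = try container.decode(String.self)
        guard let value = Decimal(string: text) else {
            throw DecodingError.dataCorruptedError(
                in: container,
                debugDescription: "Not a decimal number: \(text)")
        }
        amount = value
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.singleValueContainer()
        // "\(amount)" uses Decimal's base-ten description, e.g. "4.18".
        try container.encode("\(amount)")
    }
}

Encoding Price(amount: Decimal(string: "4.18")!) then produces the JSON string "4.18" rather than a binary-rounded number.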

7 Likes

Makes sense. For the most part, I want to work with values as Doubles... but for encoding, I just wrote a manual implementation of encode(to:) and converted to Decimal for the JSON output. Thanks!
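Roughly along these lines (a simplified sketch, with illustrative names):

import Foundation

struct Product: Codable {
    var price: Double

    enum CodingKeys: String, CodingKey {
        case price
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        // Go through the Double's short description ("4.18") rather than
        // Decimal(price), so the binary rounding noise doesn't leak into the JSON.
        let decimalPrice = Decimal(string: "\(price)") ?? Decimal(price)
        try container.encode(decimalPrice, forKey: .price)
    }
}

Decoding is left to the synthesized init(from:), which just reads the number back as the nearest Double.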

But note that if you decode 4.18 back to a Double, the value you get will in fact be 4.17999999999999971578290569596, since that is the closest representable number. For example,

let a = 4.17999999999999971578290569596
let b = 4.18
print(a == b) // true
print(a) // 4.18

The number 4.18 cannot be represented exactly as a floating point number. The real value it holds is that long, unwieldy number. But you won't notice, because most ways of printing the number will just display 4.18.

However,

let a = 4.18
print(a) // 4.18
print(String(format: "%.20f", a)) // 4.17999999999999971578

That is, when you explicitly say that you want to print the number to 20 decimal places in base ten, you'll see that the number is in fact not exactly 4.18.

7 Likes

In fact, the Double (IEEE-754 double precision floating point) value closest to 4.18 is exactly:

4.17999999999999971578290569595992565155029296875

: )
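You can see that full expansion yourself by asking for more fractional digits than the value actually has, e.g.:

import Foundation

let x = 4.18
// Asking for 50 fractional digits prints the exact stored value;
// the trailing zeros are just padding.
print(String(format: "%.50f", x))
// 4.17999999999999971578290569595992565155029296875000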

2 Likes

I'd suggest that you just use a normally encoded Double. Essentially no serialization format nowadays has higher precision than that, so you should be able to round-trip safely without losing any precision (even if it looks a little weird to human eyes).

Unless you're dealing with fixed-precision numbers (where the reader might have trouble parsing the extra digits), in which case you might just want to implement a custom-precision type anyway.
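If you do go the custom-precision route, one rough sketch is to store the minor units as an integer and only convert at the edges (the names here are just illustrative):

// Two-decimal-place fixed-point sketch: the JSON carries an integer
// number of hundredths, so there is no binary rounding involved.
struct Hundredths: Codable, Equatable {
    var units: Int   // 418 represents 4.18

    var doubleValue: Double { Double(units) / 100 }

    init(units: Int) { self.units = units }

    init(rounding value: Double) {
        units = Int((value * 100).rounded())
    }
}

let price = Hundredths(rounding: 4.18)
print(price.units)        // 418
print(price.doubleValue)  // 4.18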

3 Likes

It's also worth noting that the built-in encoder for Double will output enough decimal digits in the JSON representation to guarantee that it will come out as the exact same IEEE value when decoded later.

That is, the decimal number in the JSON might not be an exact representation of the floating point value you started with, but you can be sure it can't be mistaken for any other floating point value.

That's one reason why it doesn't need to be output as the longer-but-more-precise decimal number that @Jens quoted. The shorter decimal number identifies it just as uniquely.
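A quick way to convince yourself (wrapping the value in an array to sidestep top-level-fragment restrictions in older Foundation versions):

import Foundation

let original = 4.18
let data = try! JSONEncoder().encode([original])
let decoded = try! JSONDecoder().decode([Double].self, from: data)[0]
print(decoded == original)                        // true
print(decoded.bitPattern == original.bitPattern)  // true — the exact same IEEE value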

7 Likes

And this is also how Double (et al) conform to LosslessStringConvertible.
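That is, the shortest description is also enough to rebuild the exact value:

let x = 4.18
let s = String(x)   // "4.18" — shortest string that uniquely identifies x
let y = Double(s)!  // the LosslessStringConvertible initializer
print(y.bitPattern == x.bitPattern)  // true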

3 Likes

Of course, Float has the same round-trip guarantee, which is not surprising.

It is worth noting, though, that converting a Float to a Double for JSON encoding "to keep more precision" is not a good idea. The decimal representation of the Double may be surprisingly unrelated to that of the Float.

I don't have an example handy (it's been discussed on the Swift forums in the past), but you can have scenarios where the Float representation might be something like 0.741 while the Double representation is 0.762…. (Not a real example; I just made up those numbers.)

It's counterintuitive but correct that the Double representation doesn't "round" to the Float representation.
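A common concrete illustration with 0.1: widening a Float to a Double preserves the bits exactly, yet the shortest decimal form changes, because it now has to pin down a Double rather than a Float.

let f: Float = 0.1
print(f)            // 0.1                 — shortest digits for this Float
print(Double(f))    // 0.10000000149011612 — shortest digits for the widened Double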

4 Likes

Just one additional note to add to this — the current implementation (using [NS]JSONSerialization) will always output enough digits to guarantee round-trip equality, but there are cases where it is possible to use fewer digits while still maintaining that guarantee.

@tbkka implemented such an algorithm for Double.description way back in PR #15474 and the same could be done for JSONEncoder (the limiting factor is having to reimplement the algorithm in Foundation because JSONSerialization is Objective-C). This isn't harmful in any way, but would be a nice future change for producing "prettier" JSON. (See also SR-5961)
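You can see the difference between the two directly (the exact output depends on the Foundation version):

import Foundation

let x = 4.18
print(x.description)  // 4.18 — shortest-digits algorithm in the standard library

let json = try! JSONEncoder().encode([x])
print(String(data: json, encoding: .utf8)!)
// [4.1799999999999997] — 17 significant digits from JSONSerialization,
// enough to round-trip, but more than needed for this particular value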

5 Likes

Actually, no porting should be needed: The implementation is in C in the low-level Swift runtime. So Foundation would just have to copy a single C file into their code.

3 Likes

Sorry, should've been clearer about "reimplement"; I primarily meant to imply that Foundation couldn't benefit from the Swift-level implementation as exposed through Double directly, but agreed!

This is not the right way to test this.

You should decode the result and verify that the double that comes back is in fact equal to 4.18. The string form may have more or fewer digits than you expect; what matters is that the double value is preserved. If not, please file a bug report.
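A rough rewrite of the original test along those lines (wrapping the value in an array so the test doesn't depend on top-level fragment support):

import XCTest
import Foundation

final class PriceEncodingTests: XCTestCase {
    func testDoubleSurvivesJSONRoundTrip() throws {
        let original = 4.18
        let data = try JSONEncoder().encode([original])
        let decoded = try JSONDecoder().decode([Double].self, from: data)
        // Assert on the decoded value, not on the textual JSON.
        XCTAssertEqual(decoded, [original])
    }
}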

3 Likes

In this case, the use case is storing a price... so in practical use, the decimal precision I care about for display is always 2 digits. Having an accurate value is still important, though: for calculations, I want the most accurate number.

So, it just feels a bit odd to get a service call back with:

{
    "price": 4.17999999999999971578290569596
}

It's fine that it works in a round trip, but it could confuse users of the API, who may not be decoding it in Swift but in whatever language they're using.

So, the challenge is maintaining accuracy in the background while never exposing that to the user of the API.

Encoding as a string could make sense, but it doesn't match the existing schema, so I'm stuck with storing it as a number... and I generally dislike when APIs return numbers as strings.

For currency you should generally use Decimal, though that type has its own issues.
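For example, constructing the Decimal from a string so the value never passes through a binary Double (a sketch; the type name is made up):

import Foundation

struct Invoice: Codable {
    var price: Decimal
}

let invoice = Invoice(price: Decimal(string: "4.18")!)
let data = try! JSONEncoder().encode(invoice)
print(String(data: data, encoding: .utf8)!)  // {"price":4.18}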

3 Likes

I attempted to use Decimal instead of Double but ran into a few issues. It doesn't work nicely with String formatting. As far as I can tell, Decimal is bridged from Objective-C rather than being pure Swift. (I generally try to avoid bridged APIs when possible.) For now, I'm just doing a quick conversion to Decimal before it gets encoded.

I think Decimal would be ideal if it ever gets re-implemented as a 100% Swift-native API.

Yes, Decimal's String conversion is one of the issues I was alluding to. I have had many issues attempting to round-trip from user-entered currency Strings to Decimal to JSON and back. But yes, the standard library really needs a native Decimal type. I don't think there's much to justify avoiding bridged types, though.
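For what it's worth, formatting does work if you go through the bridge: String(format:) has no conversion for Decimal, but NumberFormatter accepts it once cast to NSDecimalNumber, e.g.:

import Foundation

let price = Decimal(string: "4.18")!

let formatter = NumberFormatter()
formatter.numberStyle = .currency
formatter.locale = Locale(identifier: "en_US")  // fixed locale just for the example
print(formatter.string(from: price as NSDecimalNumber) ?? "")  // $4.18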

3 Likes

This seems to match what I've found in round-trip testing. If I convert to Decimal during the encode and generate #"{"price": 4.18}"#... then decode, it matches the original Double value.

From what I can tell, it is.

FWIW... I tested this in JavaScript and it seems to work:

const testResponse = `{"price":4.1799999999999997}`; // "{\"price\":4.1799999999999997}"
const asObject = JSON.parse(testResponse); // {price: 4.18}

So, perhaps it's a non-issue. My only concern is that it might still look odd in Postman... though it's possible Postman also applies similar formatting in its pretty-print view.