I suspect this is because Foundation.Decimal is actually sort of terrible: it interprets floating-point literals as Swift.Double during initialization, meaning it can’t be initialized exactly from literals that aren’t exactly representable as a Double.
Personally, I wish they didn’t declare that conformance to ExpressibleByFloatLiteral at all: it’s quite misleading. It’s not the only issue with Foundation.Decimal, of course: once Swift Numerics adds a replacement, I would like to see everyone switch to that.
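For illustration, here is roughly what that conformance does in practice (the exact trailing digits can vary by platform, but the point is that the value is not exactly 3.133):

```swift
import Foundation

// The literal is routed through Double before Decimal ever sees it,
// so the stored value picks up binary floating-point noise.
let viaLiteral: Decimal = 3.133
let viaString = Decimal(string: "3.133")!

print(viaLiteral)  // e.g. 3.132999999999999488, not the value you wrote
print(viaString)   // 3.133 exactly, because no Double is involved
```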
Note that this is less a problem with Decimal and more a problem with floating-point literals (and more precisely the lack of a decimal literal). Which isn't to say that it isn't a problem, but rather that we shouldn't blame Foundation for it. =)
Of course: I’m only blaming Foundation for making Decimal conform to ExpressibleByFloatLiteral in the first place, which obscures the issue.
Foundation’s problems in general aren’t ones that can be easily solved: it certainly has problems, but they all stem from predating language features or from needing to support legacy behavior.
The actual underlying issue here has nothing to do with Decimal at all. It's because JSONDecoder uses JSONSerialization under the hood to decode JSON, and JSONSerialization does so eagerly. At decode time, JSONSerialization has no idea what type might be "desired" on the receiving end, and prefers to decode Double values where possible. When JSONDecoder receives the value to decode, it's already a Double (at which point the "damage" has been done), so even if Decimal were perfect, it's already too late.
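To make that concrete, here is a small sketch of what this looks like from the caller’s side (the exact digits, and whether it still reproduces at all, depend on the Foundation version in use):

```swift
import Foundation

struct Payment: Decodable {
    let amount: Decimal
}

// The JSON text says 68.32 exactly, but JSONSerialization parses it as a
// Double before JSONDecoder ever sees it, so the Decimal ends up being
// built from that already-lossy Double.
let json = Data(#"{"amount": 68.32}"#.utf8)
let payment = try! JSONDecoder().decode(Payment.self, from: json)
print(payment.amount)  // something like 68.31999999999999488, not 68.32
```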
The issue that @lukasa linked above has all of the [long-standing] specific details.
There are solutions to this (e.g., JSONSerialization could theoretically vend an NSNumber subclass which contains both the parsed numeric value and the underlying string data, which JSONDecoder could use based on the actual type being requested), but it'd be up to the Foundation team to decide on a direction and implement it.
You’re right, though: that wasn’t technically what was asked. The issue is extremely similar, in my defense.
I imagine other Decoder implementations might do better. You don’t have to use Foundation’s, after all. It’s not strictly in violation of the JSON spec, either.
What is the best (i.e., most correct) workaround for the JSONSerialization problem, then?
When we are working with real money and get a JSON response back, e.g.
```json
{
  "amount": 68.32
}
```
and would need to authorize that exact amount, let’s say with Apple Pay, how would one decode that message so that there is no floating-point precision error whatsoever? Changing the response from a number to a string is not an option.
And if not, I suspect that continuing to round to scale in this way is likely a reasonable solution in the meantime. @scanon — numerically, do you know if it's possible to lose precision when reading the value this way (within the numerical range of dollar amounts) such that rounding to scale would not produce the same result as the string representation of the number? (My gut says "no", but I would be interested to know if it does.)
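For concreteness, a minimal sketch of that round-to-scale workaround, assuming the API only ever sends amounts with at most two fraction digits (the `Payment` type and the `amount` key are just placeholders for illustration):

```swift
import Foundation

struct Payment: Decodable {
    let amount: Decimal

    private enum CodingKeys: String, CodingKey { case amount }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // The decoded Decimal may already carry Double noise (e.g. 68.319999...),
        // so round it back to the known currency scale of two fraction digits.
        var noisy = try container.decode(Decimal.self, forKey: .amount)
        var rounded = Decimal()
        NSDecimalRound(&rounded, &noisy, 2, .plain)
        amount = rounded
    }
}

let json = Data(#"{"amount": 68.32}"#.utf8)
let payment = try! JSONDecoder().decode(Payment.self, from: json)
print(payment.amount)  // 68.32
```

Whether this recovery is always exact for realistic dollar amounts is exactly the question posed above.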
Unfortunately not. We're just one of many API consumers, so it wouldn't be changed only because the iOS/Swift implementation is running into a problem.
In general, I have not had good experiences with the default JSONDecoder, as I have detailed here. My recommendation continues to be to implement it directly, and control your own parsing.
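As a toy illustration of what "control your own parsing" can mean in the simplest case, you can pull the number's literal characters out of the raw JSON text and hand them straight to `Decimal(string:)`, so no Double is ever involved (the key name and regex below are purely illustrative; a real implementation would use a proper JSON parser):

```swift
import Foundation

let raw = #"{"amount": 68.32}"#

// Grab the literal characters of the number following "amount" (toy regex,
// not a general JSON parser) and construct the Decimal directly from them.
let pattern = #""amount"\s*:\s*(-?\d+(?:\.\d+)?)"#
if let range = raw.range(of: pattern, options: .regularExpression) {
    let token = raw[range]
        .split(separator: ":", maxSplits: 1)[1]
        .trimmingCharacters(in: .whitespaces)
    print(Decimal(string: token) as Any)  // Optional(68.32), with no Double round-trip
}
```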