Decimal(integerLiteral: 15123122) == Decimal(string: "15123122")! returns false. What's going on here?
It turns out that the two cases somehow retain different amounts of precision:
NSDecimalNumber(decimal: Decimal(15123122)).doubleValue
$R5: Double = 15123121.999999994
NSDecimalNumber(decimal: Decimal(string: "15123122")!).doubleValue
$R4: Double = 15123122
But I don't really understand why. The documentation for this datatype is, sadly, very minimal: Apple Developer Documentation
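One way to see where the two values diverge is to compare their exponent and significand directly (a quick sketch, not from the original post; the values shown in comments are only expectations):

import Foundation

// Sketch: inspect the components of both Decimal values.
let fromInt = Decimal(integerLiteral: 15123122)
let fromString = Decimal(string: "15123122")!

print(fromInt.exponent, fromInt.significand)        // differs on the affected configuration
print(fromString.exponent, fromString.significand)  // expected: 0 15123122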
Martin
(Martin R)
2
That is indeed strange:
print(Decimal(15123122)) // 15123121.999999997952
Even if init(_ value: Int) for some reason uses an intermediate Double, this would not explain the precision loss, since 15123122 is exactly representable as a 64-bit binary floating point number.
Btw, there is a feature request to initialize a Decimal precisely from a literal: SR-3317 Literal protocol for decimal literals should support precise decimal accuracy (closed as a duplicate of SR-920 Re-design builtin compiler protocols for literal convertible types).
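As a quick sanity check of that exact representability (a small sketch, added for illustration):

// Sketch: 15123122 survives a round trip through Double, so an
// intermediate Double conversion alone cannot explain the lost digits.
let viaDouble = Double(15123122)
print(viaDouble == 15123122.0)      // true – exactly representable as a Double
print(Int(viaDouble) == 15123122)   // true – round-trips without loss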
Martin
(Martin R)
3
The behavior is different on Linux (Ubuntu 16.04, development snapshot from 8 June 2018):
Welcome to Swift version 4.2-dev (LLVM 031e148970, Clang b58a7ad218, Swift 4b9373622f). Type :help for assistance.
1> import Foundation
2> print(Decimal(15123122))
15123122
3>
4> Decimal(integerLiteral: 15123122) == Decimal(string: "15123122")!
$R0: Bool = true
5>
Have you had a chance to test the same Swift version on macOS? I just want to understand whether this is a platform issue or a Swift version issue.
Martin
(Martin R)
5
On macOS, I get the same “precision loss” that you observed with both Xcode 9.4.1 and Xcode 10 beta 2. I assume that it is a platform issue: on Apple platforms, NSDecimalNumber is part of the system's Foundation framework; on other platforms it is part of swift-corelibs-foundation (GitHub - apple/swift-corelibs-foundation).
spevans
(Simon Evans)
6
Decimal was fixed in swift-corelibs-foundation so that it no longer uses Double when initialising from integers (see [4.2] SR-7650: Incorrect result adding and subtracting Decimals by spevans · Pull Request #1588 · apple/swift-corelibs-foundation), although this only applies to the non-macOS version of Foundation.
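Until the Darwin Foundation picks up an equivalent fix, one possible workaround on Apple platforms is to construct the value digit-exactly with NSDecimalNumber(mantissa:exponent:isNegative:). A minimal sketch (the exactDecimal helper is hypothetical, not part of Foundation):

import Foundation

// Sketch: build a Decimal from an Int without going through Double.
func exactDecimal(_ value: Int) -> Decimal {
    // NSDecimalNumber(mantissa:exponent:isNegative:) constructs the value
    // from a UInt64 mantissa and a decimal exponent, digit-exactly.
    return NSDecimalNumber(mantissa: UInt64(value.magnitude),
                           exponent: 0,
                           isNegative: value < 0).decimalValue
}

print(exactDecimal(15123122) == Decimal(string: "15123122")!)  // expected: true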
The comparison returns true on macOS Big Sur (MacBook Air, M1, 2020) with Xcode 13 and Swift 5.5.
young
(rtSwift)
8
Monterey 12.0.1 (21A558), Xcode 13.1 Playground:
import Foundation
Decimal(integerLiteral: 15123122) == Decimal(string: "15123122") // true
let n = 15123122
let a = Decimal(n) // works
let b = Decimal(exactly: n) // error: Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0).
let c = Decimal(exactly: 15123122) // error: Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0).
Why does Decimal(exactly:) crash?