Weird behaviour in Decimal

Decimal(integerLiteral: 15123122) == Decimal(string: "15123122")! returns false. What's going on here?

It turns out that the two values somehow retain different amounts of precision:

NSDecimalNumber(decimal: Decimal(15123122)).doubleValue
$R5: Double = 15123121.999999994

NSDecimalNumber(decimal: Decimal(string: "15123122")!).doubleValue
$R4: Double = 15123122

But I don't really understand why. The documentation for this datatype is, sadly, very minimal: https://developer.apple.com/documentation/foundation/decimal
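If it helps the investigation, the two values can also be inspected directly via Decimal's exponent and significand properties instead of round-tripping through NSDecimalNumber (a small sketch, assuming only Foundation):

import Foundation

// Look at the internal exponent/significand of each value rather than
// converting to Double via NSDecimalNumber.
let viaInt = Decimal(integerLiteral: 15123122)
let viaString = Decimal(string: "15123122")!
print(viaInt.exponent, viaInt.significand)
print(viaString.exponent, viaString.significand)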


That is indeed strange:

print(Decimal(15123122)) // 15123121.999999997952

Even if init(_ value: Int) for some reason uses an intermediate Double, this would not explain the precision loss, since 15123122 is exactly representable as a 64-bit binary floating point number.
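A quick sanity check of that claim (a sketch, nothing beyond the standard library assumed):

// 15123122 is far below 2^53, so it fits in Double's 53-bit significand
// and a round trip through Double is lossless on its own.
let n = 15123122
let d = Double(exactly: n)!   // non-nil precisely because the conversion is exact
print(d)                      // 15123122.0
print(Int(exactly: d)!)       // 15123122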

Btw, there is a feature request to initialize a Decimal precisely from a literal: SR-3317 Literal protocol for decimal literals should support precise decimal accuracy (closed as a duplicate of SR-920 Re-design builtin compiler protocols for literal convertible types).

The behavior is different on Linux (Ubuntu 16.04, development snapshot from 8 June 2018):

Welcome to Swift version 4.2-dev (LLVM 031e148970, Clang b58a7ad218, Swift 4b9373622f). Type :help for assistance.
  1> import Foundation
  2> print(Decimal(15123122)) 
15123122
  3>  
  4> Decimal(integerLiteral: 15123122) == Decimal(string: "15123122")!
$R0: Bool = true
  5>

Have you had a chance to test the same Swift version on macOS? I just want to understand whether this is a platform issue or a Swift version issue.

On macOS, I get the same “precision loss” that you observed, with both Xcode 9.4.1 and Xcode 10 beta 2. I assume that it is a platform issue: on Apple platforms, NSDecimalNumber is part of the system's Foundation framework, while on other platforms it is part of https://github.com/apple/swift-corelibs-foundation.

Decimal was fixed in swift-corelibs-foundation so that it no longer goes through Double when initializing from integers (see https://github.com/apple/swift-corelibs-foundation/pull/1588), although that fix only applies to the non-macOS version of Foundation.
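Roughly, the idea of that fix is to fill in the Decimal's 16-bit mantissa words straight from the integer, so no binary floating-point value is ever involved. Below is my own sketch of that idea using the public memberwise initializer; the actual PR may be structured differently:

import Foundation

// Sketch only: build a Decimal from a UInt64 by copying its bits into the
// eight 16-bit mantissa words. Not the swift-corelibs-foundation code itself.
func decimalWithoutDouble(_ value: UInt64) -> Decimal {
    var v = value
    var words = [UInt16](repeating: 0, count: 8)
    var length: UInt32 = 0
    while v != 0 {                                  // a UInt64 needs at most 4 words
        words[Int(length)] = UInt16(truncatingIfNeeded: v)
        v >>= 16
        length += 1
    }
    return Decimal(_exponent: 0, _length: length, _isNegative: 0,
                   _isCompact: 0, _reserved: 0,
                   _mantissa: (words[0], words[1], words[2], words[3],
                               words[4], words[5], words[6], words[7]))
}

print(decimalWithoutDouble(15123122))    // 15123122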

Returns true on macOS Big Sur, on a MacBook Air (M1, 2020), with Xcode 13 / Swift 5.5.

Monterey 12.0.1 (21A558), Xcode 13.1 Playground:

import Foundation

Decimal(integerLiteral: 15123122) == Decimal(string: "15123122")    // true

let n = 15123122
let a = Decimal(n)  // works
let b = Decimal(exactly: n)       // error: Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0).
let c = Decimal(exactly: 15123122) // error: Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0).

Why does Decimal(exactly:) crash?
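In the meantime, this sketch avoids the crashing initializer by building the value from the integer's decimal string, which involves no floating-point step (exactDecimal is a made-up helper name, not a Foundation API):

import Foundation

// Workaround sketch: skip Decimal(exactly:) entirely.
func exactDecimal<T: BinaryInteger>(_ value: T) -> Decimal {
    Decimal(string: String(value))!   // the digits of an integer always parse
}

let b = exactDecimal(15123122)        // 15123122, no crash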
