tkrajacic
(Thomas Krajacic)
1
I think it should be possible to initialize an NSDecimalNumber or Decimal from a literal without using a String.
I know this is probably well known, but to get the benefits of a decimal number you currently have to use either the String initializer or the one that takes a mantissa and an exponent.
let literal: NSDecimalNumber = 3.1415926536
// 3.141592653600002
let literalDecimal: Decimal = 3.1415926536
// 3.141592653600000512
let initializer = NSDecimalNumber(value: 3.1415926536)
// 3.141592653600002
// Only the following are exact:
let exact1 = NSDecimalNumber(mantissa: 31415926536, exponent: -10, isNegative: false)
// 3.1415926536
let exact2 = NSDecimalNumber(string: "3.1415926536")
// 3.1415926536
Now that there is a discussion about literal initialization (Literal initialization via coercion), I want to ask: is there any particular reason we can't initialize a decimal number without going through a String?
Can't we just have:
let literal: NSDecimalNumber = 3.1415926536
// exactly 3.1415926536
Am I missing something? Shouldn't the type checker easily infer the correct type for the literal? Or does it not know what a decimal number is?
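A quick way to make the mismatch concrete (for illustration; this reuses the value from above, and the digits were observed on current Apple Foundation, so they may differ slightly elsewhere):

import Foundation

let fromLiteral: Decimal = 3.1415926536            // routed through a binary float literal
let fromString = Decimal(string: "3.1415926536")!  // parsed from the decimal text
print(fromLiteral == fromString)
// false — 3.141592653600000512 vs. 3.1415926536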
The issue, as I remember it, is that you still end up going "through" a binary representation when converting from a literal to a decimal number. I am light on proof right now, but I think that is the issue.
EDIT: Silly me. It says so right there on the tin: ExpressibleByFloatLiteral. The "Float" here implies a binary representation.
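To spell that out, here is a minimal sketch of what the compiler effectively does for let d: Decimal = 3.1415926536, assuming Decimal's FloatLiteralType is Double (the default); the actual Foundation code may differ, but the shape is the same:

import Foundation

let step1: Double = 3.1415926536   // the literal is rounded to the nearest Double first
let step2 = Decimal(step1)         // Decimal only ever sees the already-rounded Double
print(step2)
// 3.141592653600000512 — matching the Decimal literal result above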
idrougge
(Iggy Drougge)
4
This has now gone on for so long that there is a proposal to add a SwiftLint rule to discourage initializing Decimal from float literals: Rule Request: discouraged_decimal_float_literal · Issue #2833 · realm/SwiftLint · GitHub
Literals that are resolved with less precision than both their textual representation and what is representable by the target type are not just a small bug, they're a danger.
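For illustration only (exact digits depend on the Foundation version, but the comparison fails because the literal is routed through Double), here is the kind of silent divergence such a rule would catch:

import Foundation

let price: Decimal = 0.07                   // looks exact, but goes through Double first
let exactPrice = Decimal(string: "0.07")!   // exact decimal value
print(price == exactPrice)
// false on current Foundation implementations — the literal carries binary rounding error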