I think it should be possible to initialize an `NSDecimalNumber` or `Decimal` from a literal without using a `String`.
This is probably well known, but to get the benefits of a decimal number you currently need to use either the `String` initializer or the one where you specify mantissa and exponent:
```swift
let literal: NSDecimalNumber = 3.1415926536
// 3.141592653600002
let literalDecimal: Decimal = 3.1415926536
// 3.141592653600000512
let initializer = NSDecimalNumber(value: 3.1415926536)
// 3.141592653600002

// Only those are exact
let exact1 = NSDecimalNumber(mantissa: 31415926536, exponent: -10, isNegative: false)
// 3.1415926536
let exact2 = NSDecimalNumber(string: "3.1415926536")
// 3.1415926536
```
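If I'm reading the Foundation API right, the `Decimal` value type has the same kind of escape hatches: the failable `init(string:)` and `init(sign:exponent:significand:)`. A quick sketch:

```swift
import Foundation

// Exact: parse the digits from a string (failable initializer).
let exactDecimal1 = Decimal(string: "3.1415926536")!
// 3.1415926536

// Exact: build the value as 31415926536 * 10^(-10).
let exactDecimal2 = Decimal(sign: .plus, exponent: -10, significand: Decimal(31415926536))
// 3.1415926536
```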
Now that there is a discussion about literal initialization (Literal initialization via coercion), I want to ask if there is any particular reason we can't initialize a decimal number without the `String` literal.
Can't we just have:
```swift
let literal: NSDecimalNumber = 3.1415926536
// exactly 3.1415926536
```
Am I missing something? Shouldn't the type checker easily infer the correct type for the literal? Or does it not know what a decimal number is?
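As far as I can tell, the reason the literal is inexact is that the float-literal path goes through `Double`: `Decimal` conforms to `ExpressibleByFloatLiteral` with `Double` as its literal type, so the value is rounded to binary floating point before `Decimal` ever sees the digits. A rough sketch of what I assume happens today:

```swift
import Foundation

// The literal is materialized as a binary Double first...
let intermediate: Double = 3.1415926536
// ...and only then passed to Decimal's float-literal initializer,
// so the rounding error is already baked in at this point.
let viaFloatLiteral = Decimal(floatLiteral: intermediate)
// 3.141592653600000512
```

So if that's right, the type checker does infer `Decimal` correctly; the digits are simply lost in the intermediate `Double` conversion, which is exactly what a dedicated decimal literal (or coercion) could avoid.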