I have an issue with a number conversion that appears to go completely wrong.
The issue can be recreated as follows:
import Foundation
var decimal = Decimal(_exponent: -16, _length: 4, _isNegative: 0, _isCompact: 0, _reserved: 0, _mantissa: (65023, 35303, 8964, 35527, 0, 0, 0, 0))
var number = NSDecimalNumber(decimal: decimal)
number
number.doubleValue
number.int64Value
var cfnum = number as CFNumber
var value: Int64 = 0
_ = CFNumberGetValue(cfnum, CFNumberType.longLongType, &value)
value
In a Playground (in Xcode 10.3), number evaluates to 999.9999999999999487, but both number.int64Value and the CFNumberGetValue call report -844.
The issue is, naturally, that I don't think that 999.9999999999999487 as an Int64 should be -844.
I think that the issue has to do with bridging. The CFNumber code is included to show how .int64Value is implemented under the hood, and it gives the same result.
My guess is that part of the value is interpreted as a sign when converted, but I have dug around in swift-corelibs-foundation and in CoreFoundation without being able to find out exactly what is going wrong.
Can anyone figure out whether the above output is intended, or if there is indeed a bug here?
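For what it's worth, a possible workaround (just a sketch, assuming the misconversion only affects the full-precision, long-mantissa value) is to round the number to zero fraction digits with an NSDecimalNumberHandler before asking for int64Value:

```swift
import Foundation

let decimal = Decimal(_exponent: -16, _length: 4, _isNegative: 0, _isCompact: 0,
                      _reserved: 0, _mantissa: (65023, 35303, 8964, 35527, 0, 0, 0, 0))
let number = NSDecimalNumber(decimal: decimal)  // 999.9999999999999487

// Round to 0 fraction digits first; the rounded value has a short
// mantissa, so the subsequent Int64 conversion should be exact.
let handler = NSDecimalNumberHandler(roundingMode: .plain,
                                     scale: 0,
                                     raiseOnExactness: false,
                                     raiseOnOverflow: false,
                                     raiseOnUnderflow: false,
                                     raiseOnDivideByZero: false)
let rounded = number.rounding(accordingToBehavior: handler)
print(rounded.int64Value)  // 1000
```

Of course this only helps when you actually want the rounded value; it sidesteps the conversion rather than fixing it.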
I've noticed too that Decimal / NSDecimalNumber is largely broken for its intended use: arbitrary-precision numbers, especially when parsing from String. We have our own String parsing instead.
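Our replacement is roughly along these lines (a simplified sketch, not our production code, and parseDecimal is just an illustrative name): accumulate the digits into a UInt64 significand, track the exponent from the position of the decimal point, and build the Decimal from those parts so no precision is lost on the way:

```swift
import Foundation

// Sketch of parsing a decimal string manually instead of relying on
// NSDecimalNumber(string:). Returns nil on malformed input or overflow.
func parseDecimal(_ s: String) -> Decimal? {
    var sign = FloatingPointSign.plus
    var significand: UInt64 = 0
    var exponent = 0
    var seenDot = false
    var seenDigit = false
    var rest = Substring(s)
    if rest.first == "-" { sign = .minus; rest = rest.dropFirst() }
    else if rest.first == "+" { rest = rest.dropFirst() }
    for ch in rest {
        if ch == "." {
            if seenDot { return nil }           // at most one decimal point
            seenDot = true
        } else if ch.isASCII, let d = ch.wholeNumberValue {
            let (m, o1) = significand.multipliedReportingOverflow(by: 10)
            let (a, o2) = m.addingReportingOverflow(UInt64(d))
            if o1 || o2 { return nil }          // a real parser would widen here
            significand = a
            if seenDot { exponent -= 1 }        // each fraction digit shifts by 10^-1
            seenDigit = true
        } else {
            return nil
        }
    }
    guard seenDigit else { return nil }
    return Decimal(sign: sign, exponent: exponent, significand: Decimal(significand))
}
```

A UInt64 significand covers up to 19 digits, which is enough for the 999.9999999999999487 example; anything longer needs a wider accumulator.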
This runs the Linux version of Swift. The error does not appear there, so I guess it is fixed in swift-corelibs-foundation but not in Darwin Foundation?
Does it make sense to create a 'radar' for this even though it has been reported several years back without any fixes?
I can confirm that it works correctly with Swift 5.1.1 on Linux:
Welcome to Swift version 5.1.1 (swift-5.1.1-RELEASE).
Type :help for assistance.
1> import Foundation
2> let dec = NSDecimalNumber(string:"999.9999999999999487")
...
3> dec.int64Value
$R0: Int64 = 999
As I understand it, filing radars always makes sense if the problem affects you.