Hi there,

I have an issue with a number conversion that appears to go completely wrong.

The issue can be recreated as follows:

```swift
import Foundation
var decimal = Decimal(_exponent: -16, _length: 4, _isNegative: 0, _isCompact: 0, _reserved: 0, _mantissa: (65023, 35303, 8964, 35527, 0, 0, 0, 0))
var number = NSDecimalNumber(decimal: decimal)
number
number.doubleValue
number.int64Value
var cfnum = number as CFNumber
var value: Int64 = 0
_ = CFNumberGetValue(cfnum, CFNumberType.longLongType, &value)
value
```

In a Playground (in Xcode 10.3), this gives the following results:

```
999.9999999999999487    // decimal
1000                    // number (declaration)
1000                    // number
1000                    // number.doubleValue
-844                    // number.int64Value
1000                    // cfnum
0                       // value (declaration)
true                    // CFNumberGetValue(...)
-844                    // value
```

The issue is, naturally, that I don't think that 999.9999999999999487 converted to an Int64 should be -844.

I think that the issue has to do with bridging. The CFNumber code is included to show how `.int64Value` is implemented underneath, and it gives the same result.

My guess is that part of the value is interpreted as a sign when converted, but I have dug around in swift-corelibs-foundation and in CoreFoundation without being able to find out exactly what is going wrong.
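To illustrate that guess, here is a sketch of what *might* be happening. I have not confirmed that the implementation actually does this, but if the 64-bit significand is reinterpreted as a signed `Int64` before the exponent is applied, the arithmetic works out to exactly -844:

```swift
// The four 16-bit mantissa words from the Decimal above, assembled
// little-endian into one unsigned 64-bit significand.
let mantissa: UInt64 = 65023 | (35303 << 16) | (8964 << 32) | (35527 << 48)
// mantissa == 9_999_999_999_999_999_487, which is greater than Int64.max,
// so reinterpreting the same bit pattern as a *signed* value flips the sign:
let misread = Int64(bitPattern: mantissa)
// Applying the exponent of -16 as a truncating integer division by 10^16:
let scaled = misread / 10_000_000_000_000_000
print(misread)  // -8446744073709552129
print(scaled)   // -844
```

That would be consistent with the sign misinterpretation I'm guessing at, but again, this is only a hypothesis about the cause, not a reading of the actual source.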

Can anyone figure out whether the above output is intended, or whether there is indeed a bug here?

Sincerely,

/morten