I'm not sure this totally justifies things; after all, AnyHashable
does not attempt to recover this behavior for arbitrary BinaryInteger
types:
```swift
struct S: BinaryInteger {
    var wrapped: Int
    // details omitted...
}

0 as Int == 0 as S                            // true
AnyHashable(0 as Int) == AnyHashable(0 as S)  // false
```
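(For the curious: the mixed-type comparison on the second-to-last line type-checks because BinaryInteger supplies a generic heterogeneous `==` that compares mathematical values across conforming types. A minimal sketch using only standard library types:)

```swift
// BinaryInteger declares
//   static func == <Other: BinaryInteger>(lhs: Self, rhs: Other) -> Bool
// which compares values, not bit patterns or concrete types.
let x: Int = 200
let y: UInt8 = 200
assert(x == y)                  // heterogeneous comparison succeeds

let z: Int8 = -1
assert(!(z == UInt8(255)))      // same bit pattern, different values
```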
Of course, we can restore this behavior by conforming to the 'secret' _HasCustomAnyHashableRepresentation
protocol:
```swift
extension S: _HasCustomAnyHashableRepresentation {
    public func _toCustomAnyHashable() -> AnyHashable? {
        return wrapped._toCustomAnyHashable()
    }
}

AnyHashable(0 as Int) == AnyHashable(0 as S)  // true
```
But based on the implementation of _toCustomAnyHashable()
for the standard integer types, it seems clear to me that the goal there is to match NSNumber
semantics, not to provide a canonical representation based on BinaryInteger
equality.
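(To illustrate that NSNumber-style unification: the standard integer types already share a custom AnyHashable representation, so erasing them discards the concrete type. A short sketch with standard types only:)

```swift
// Standard library integers compare equal under AnyHashable
// regardless of the concrete type that was erased:
assert(AnyHashable(42 as Int) == AnyHashable(42 as UInt8))

// Equal AnyHashables must also hash identically, so a
// Set<AnyHashable> collapses them into a single element:
let set: Set<AnyHashable> = [AnyHashable(1 as Int), AnyHashable(1 as UInt16)]
assert(set.count == 1)
```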