Why are SIMD types not Hashable?

Two questions:

I noticed that, e.g., int2 (the simd type) is Equatable but not Hashable. Is there any particular reason why it shouldn't also be Hashable?

I want to use it as Key in a Dictionary (used as part of a spatial indexing data type), and thus I had to:

extension int2 : Hashable {
    public var hashValue: Int {
        return unsafeBitCast(self, to: Int.self)
    }
}

I suppose that is OK (since both Int and int2 are 8 bytes)?
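For what it's worth, on toolchains with SE-0206's `Hasher` (Swift 4.2 and later), a width-independent alternative is to combine the lanes instead of bit-casting. A minimal sketch using a hypothetical `GridKey` wrapper (so it can't clash with any conformance the toolchain already provides; note that int2 is SIMD2&lt;Int32&gt; on modern toolchains):

```swift
// Hypothetical wrapper for using a 2-lane integer vector as a Dictionary
// key without unsafeBitCast; correct regardless of Int's width.
struct GridKey: Hashable {
    let v: SIMD2<Int32>

    static func == (a: GridKey, b: GridKey) -> Bool { a.v == b.v }

    func hash(into hasher: inout Hasher) {
        hasher.combine(v.x)
        hasher.combine(v.y)
    }
}

var cells: [GridKey: String] = [:]
cells[GridKey(v: [1, 2])] = "occupied"
print(cells[GridKey(v: [1, 2])] ?? "empty")  // occupied
```

The wrapper is redundant on toolchains where the SIMD types are already Hashable, but it shows the lane-combining idea without any layout assumptions.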

This is a specific case (and I do not think that's normal).

What is a specific case of what?

Int is not always 64 bits, so your bit cast would be problematic.

True (but for this particular use case, only 64 bit architectures are supported).
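The width difference is easy to check; a quick sketch:

```swift
// Int matches the platform word size, so the "both are 8 bytes"
// assumption only holds on 64-bit targets.
print(MemoryLayout<Int>.size)           // 8 on 64-bit platforms, 4 on 32-bit
print(MemoryLayout<SIMD2<Int32>>.size)  // 8 on any platform
```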

But as to the first question, any idea why the simd vector types shouldn't be Hashable by default (just as they are Equatable)?

The simd overlay implements Equatable by calling simd_equal. There isn't such an obvious built-in way to implement Hashable.

Please note the inconsistency: the single-element simd vectors are Hashable while the rest are not, e.g.:
simd_uint1 <-- Is Hashable (via UInt32) while
simd_uint2 <-- Is not Hashable
simd_uint3 <-- Is not Hashable
simd_uint4 <-- Is not Hashable

The following way to make all simd vector types (not only the single-element ones) conform to Hashable seems straightforward IMHO; please let me know if/why the stdlib shouldn't do something similar:

extension simd_uint2 : Hashable {
    public var hashValue: Int { return unsafeBitCast(self, to: UInt64.self).hashValue }
}
extension simd_uint3 : Hashable {
    public var hashValue: Int { return unsafeBitCast(self, to: DoubleWidth<UInt64>.self).hashValue }
}
extension simd_uint4 : Hashable {
    public var hashValue: Int { return unsafeBitCast(self, to: DoubleWidth<UInt64>.self).hashValue }
}
extension simd_float2 : Hashable {
    public var hashValue: Int { return unsafeBitCast(self, to: UInt64.self).hashValue }
}
extension simd_float3 : Hashable {
    public var hashValue: Int { return unsafeBitCast(self, to: DoubleWidth<UInt64>.self).hashValue }
}
extension simd_float4 : Hashable {
    public var hashValue: Int { return unsafeBitCast(self, to: DoubleWidth<UInt64>.self).hashValue }
}
extension simd_double2 : Hashable {
    public var hashValue: Int { return unsafeBitCast(self, to: DoubleWidth<UInt64>.self).hashValue }
}
extension simd_double3 : Hashable {
    public var hashValue: Int { return unsafeBitCast(self, to: DoubleWidth<DoubleWidth<UInt64>>.self).hashValue }
}
extension simd_double4 : Hashable {
    public var hashValue: Int { return unsafeBitCast(self, to: DoubleWidth<DoubleWidth<UInt64>>.self).hashValue }
}

This code compiles only with recent developer snapshots (as, e.g., Xcode 9.2's default toolchain doesn't have DoubleWidth).
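For what it's worth, on toolchains that have SE-0206's `Hasher` and the generic `SIMD` protocol (SE-0229), a lane-combining implementation avoids both the bit cast and `DoubleWidth`; a sketch (the method name `laneHashValue` is mine, not an API):

```swift
// Sketch: hash a vector by feeding each lane to the Hasher. The result is
// independent of Int's width and of the vector's padded storage (e.g. the
// unused 4th storage lane of a 3-lane vector never affects the hash), and
// Scalar's own Hashable conformance handles -0.0 == 0.0 for floats.
extension SIMD where Scalar: Hashable {
    func laneHashValue() -> Int {
        var hasher = Hasher()
        for i in indices { hasher.combine(self[i]) }
        return hasher.finalize()
    }
}

print(SIMD3<Float>(1, 2, 3).laneHashValue() ==
      SIMD3<Float>(1, 2, 3).laneHashValue())  // true
```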

EDIT:
I later realized that the above is incorrect for floating-point vectors: since 0.0 == -0.0, the hash values of -0.0 and 0.0 must be equal. I.e., for any Float f, f.hashValue == Int(truncatingIfNeeded: f.bitPattern), except for f == -0.0.
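The zero-sign problem can be seen directly; a small check:

```swift
// Why bit-pattern hashing breaks for floats: 0.0 == -0.0, but their bit
// patterns differ, so a conforming hash must map them to the same value.
let a: Float = 0.0
let b: Float = -0.0
print(a == b)                        // true
print(a.bitPattern == b.bitPattern)  // false (0x00000000 vs 0x80000000)
print(a.hashValue == b.hashValue)    // true: Hashable requires it, since a == b
```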

simd isn't part of the Swift standard library. Perhaps this is more a question for Apple?

Correct, the simd types are not part of the stdlib, so I was wrong to write "stdlib" above. I meant it as a question for Apple, and perhaps the Swift community, regarding the design and philosophy of the SIMD vector type support, as can be seen in e.g.:
https://github.com/apple/swift/blob/master/stdlib/public/SDK/simd/simd.swift.gyb

It's very interesting. I discovered a property of the SIMD types: they can even be members of Sets!

    let one: SIMD2<Double> = [0.02, 0]
    let two: SIMD2<Double> = [0.02, 0]
    let n = one == two
    print(n) // true

And we can compare Doubles in their vector representation.
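On toolchains where the generic SIMD types gained Hashable (the SE-0229 era), that Set membership works directly; a sketch:

```swift
// Equal vectors collapse to one Set member, just like any Hashable value.
let points: Set<SIMD2<Double>> = [[0.02, 0], [0.02, 0], [1, 1]]
print(points.count)  // 2: the duplicate [0.02, 0] is merged
```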