func identify<T: ~Copyable>(type: T.Type) -> ObjectIdentifier {
    ObjectIdentifier(type) // error: Argument type 'T.Type' expected to be an instance of a class or class-constrained type
}
I'm assuming this is a bug (...?) in how suppressed constraints are interpreted by the type system, which prevents `T.Type` from satisfying ObjectIdentifier's `init(_ x: any Any.Type)` initializer. Is there a workaround?
Here's what you're working with; it's not ObjectIdentifier-specific.
(any Copyable).self is any Any.Type // 'is' test is always true
(any ~Copyable).self is any Any.Type // Cast from '(any ~Copyable).Type' to unrelated type 'any Any.Type' always fails
As for workarounds... it seems that in the 6.1 compiler, force-casting to `any Any.Type` before passing the value into the ObjectIdentifier init may work. It emits the (erroneous?) warning Danny mentioned above that the cast will 'always fail', but it doesn't actually appear to. I'm not sure whether that's a totally reliable solution or has any downside risks, though.
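A minimal sketch of that force-cast workaround, applied to the function from the original question (behavior observed on a 6.1 compiler, per the post above; the spurious warning may still be emitted):

```swift
func identify<T: ~Copyable>(type: T.Type) -> ObjectIdentifier {
    // Force-cast the metatype to `any Any.Type` so it can satisfy
    // ObjectIdentifier's init(_ x: any Any.Type). The compiler warns
    // that this cast "always fails", but it appears to succeed at runtime.
    ObjectIdentifier(type as! any Any.Type)
}
```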
Swift Testing currently does an unsafe bitcast here. In this context, this cast is safe, but I do like @mattcurtis' workaround which avoids even a whiff of "unsafe" here.
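For comparison, the unsafe-bitcast approach looks roughly like this (a sketch of the idea, not Swift Testing's actual code; it relies on metatype values being pointer-sized, which holds in this context but is exactly the kind of assumption the type system can't check for you):

```swift
func identify<T: ~Copyable>(type: T.Type) -> ObjectIdentifier {
    // Reinterpret the metatype value as `any Any.Type` without any
    // runtime cast. Safe here because both representations are a
    // single pointer, but it bypasses the type system entirely.
    ObjectIdentifier(unsafeBitCast(type, to: (any Any.Type).self))
}
```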
I cannot confirm what will actually ship in Swift 6.2. I do not know for sure -- nobody does. Things can and do get rolled back sometimes, as unforeseen issues can get uncovered during testing.
For what it's worth, the generalization of ObjectIdentifier's initializer is currently on track to ship in 6.2. It was proposed in SE-0465; that proposal got accepted; the change has been merged to the 6.2 release branch; and as of today, I am not aware of any problem that would require taking it out.
Heh, fair. I didn't mean to suggest otherwise. More just meant to confirm that the issue you addressed is the same as the issue described in this thread.