I dug around for prior discussion here but I couldn't find any beyond the original thread that introduced the integer protocols back in 2017.
In swift-protobuf, I had this harmless assertion to verify that a UInt64 value that I decoded was actually in the range of a UInt32:
assert(theUInt64 < 1 &<< 32)
This was great, until someone ran it on an armv7k Apple Watch device. I didn't understand why this would fail at first; I expected that since the left-hand side of the comparison was explicitly a UInt64, the constraint solver would percolate that through to the right-hand side and infer it as UInt64 too.
Unfortunately, that's not what happens. Between these two overloads:
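For reference, the two candidates look roughly like this (signatures paraphrased from the standard library, not quoted verbatim):

```swift
// Candidate 1: concrete, homogeneous — UInt64's Comparable requirement
static func < (lhs: UInt64, rhs: UInt64) -> Bool

// Candidate 2: generic, heterogeneous — a BinaryInteger extension
static func < <Other: BinaryInteger>(lhs: Self, rhs: Other) -> Bool
```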
The constraint solver ends up inferring 1 &<< 32 to be of type Int, and then it chooses the second one, the heterogeneous generic operator with substitutions (UInt64, Int) -> Bool. And since &<< masks its shift amount to the type's bit width, 1 &<< 32 for a 32-bit Int is 1 &<< (32 % 32) == 1, and assertion go boom for any decoded value of 1 or more.
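One way to fix it (a sketch of the idea, not the actual swift-protobuf patch) is to keep the whole expression in UInt64 so the homogeneous overloads apply:

```swift
let theUInt64: UInt64 = 123  // hypothetical decoded value

// Pin the literal's type so &<< and < both resolve homogeneously in UInt64:
assert(theUInt64 < (1 as UInt64) &<< 32)

// Or sidestep the shift entirely:
assert(theUInt64 <= UInt64(UInt32.max))
```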
It's an easy enough fix, but more importantly:
Is it the expected behavior that the compiler is preferring the generic heterogeneous operator over the concrete homogeneous one? At a surface level, we know that the compiler is supposed to prefer concrete overloads to generic ones, but constraint evaluation order can visit different paths that make that less straightforward.
Would this outcome be obvious to anyone who isn't a compiler expert?
Would we consider it a bug, and if so, is there a path to fixing it that isn't potentially source-breaking or runtime-breaking?
I think this is expected, by virtue of the fact that selecting &<<: (Int, Int) -> Int gives us no non-default literals (compared to the &<<: (UInt64, UInt64) -> UInt64 overload), so that solution is automatically 'better' before we even get around to comparing overload sets. Looking at the -debug-constraints output, I don't even see an attempt to evaluate the UInt64 overload; maybe it's getting eagerly discarded by some optimization because we find the Int overload first. Normally I'd at least expect to see output where the overload was considered briefly and then thrown out when the non-default-literal score increased.
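(For anyone following along: -debug-constraints is a frontend-only flag, so a typical invocation looks something like this; command sketch only.)

```shell
# Dump the constraint solver's trace while type-checking a file.
# -debug-constraints is a frontend flag, hence -Xfrontend.
swiftc -Xfrontend -debug-constraints shift.swift
```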
ETA: perhaps implicitly answered by my response but...
No, I don't think so. The non-default literal scoring has always bothered me a bit. I realize there are cases where it's necessary to make certain expressions unambiguous, but it also admits a lot of unintuitive cases like this one, where we end up preferring a solution/overload set that requires type conversions and generic overloads over the single-typed solution.
There have been a few iterations of this problem that have popped up over the years; each time it's been noticed for concretely typed operations we've added (or added back) overloads to try to 'correct' the behavior. Here's one involving != that I filed not so recently:
I'm pretty sure that the expression you show hasn't always favored heterogeneous comparison. If I'm right, then either overload resolution has been slightly reworked again or an overload has been dropped, perhaps exacerbated by 32-bit platforms being less well tested over the years.
With the bit shifting operators specifically, it was always considered that a heterogeneous RHS operand was totally fine since there's nothing semantically about how much you shift the LHS that prefers homogeneity. My expectation would be that &<<(_: UInt64, _: Int) -> UInt64 would be invoked here as the preferred overload, permitting homogeneous comparison.
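A quick illustration of that heterogeneous-shift expectation (my own example, not from the thread): the generic << on BinaryInteger accepts any BinaryInteger shift amount and preserves the left operand's type, so an Int shift count never drags the result down to Int.

```swift
let lhs: UInt64 = 1
let amount: Int = 40          // heterogeneous RHS: an Int shifting a UInt64

// The generic `<< <Other: BinaryInteger>` overload keeps the LHS type,
// so `shifted` is a UInt64 and comparisons against it stay homogeneous.
let shifted = lhs << amount
assert(type(of: shifted) == UInt64.self)
assert(shifted == 1 << 40)
```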
Yeah, I can no longer get Godbolt to execute Swift 4.x code in order to determine if it's a regression, but it's almost certainly an oversight somehow related to &<< (and likely &>>), since << (still) behaves as expected:
let x = (0 as Int32) == 1 << 32
let y = (1 as Int32) == 1 &<< 32
let x_ = (0 as Int32) == (1 as Int32) << 32
let y_ = (1 as Int32) == (1 as Int32) &<< 32
print(x, y, x_, y_) // true false true true
The difference, as far as I can tell, is that only &<< is defined on our concrete integer types—for << we inherit the generic implementation via BinaryInteger. In -debug-constraints, only the following overloads are found for <<:
> $T2 bound to decl Swift.(file).BinaryInteger.<< : <Self, RHS where Self : BinaryInteger, RHS : BinaryInteger> (Self.Type) -> (Self, RHS) -> Self
> $T2 bound to decl Swift.(file).BinaryInteger extension.<< : <Self, RHS where Self : BinaryInteger, RHS : BinaryInteger> (Self.Type) -> (Self, RHS) -> Self
> $T2 bound to decl Swift.(file).FixedWidthInteger extension.<< : <Self, Other where Self : FixedWidthInteger, Other : BinaryInteger> (Self.Type) -> (Self, Other) -> Self
SwiftFiddle has Linux builds going as far back as 2.2, and it looks like this printed true false true true in 4.0 as well: SwiftFiddle - Swift Online Playground