error: binary operator '^' cannot be applied to operands of type 'UInt16' and 'Int'
let sign:UInt16 = bitPattern &>> (count &- 1) ^ 1
~~~~~~~~~~~~~~~~~~~~~~~~~~~ ^ ~
note: expected an argument list of type '(UInt16, UInt16)'
let sign:UInt16 = bitPattern &>> (count &- 1) ^ 1
let bitPattern: UInt16 = 0
let count: Int = 1
let sign: UInt16 = bitPattern &>> (count &- 2) ^ 3
Strangely, changing the 2 to Int(2) allows it to compile, as does removing the subtraction entirely (bitPattern &>> count ^ 1), in addition to the "obvious" fix of UInt16(3), which is what the error message is actually suggesting. Could you file a bug?
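For reference, a sketch of the variants mentioned above, written against the reproducer's 2 and 3. This assumes the compiler version from the thread; on a compiler without the bug, the original line may compile too:

```swift
let bitPattern: UInt16 = 0
let count: Int = 1

// Original (rejected by the affected compiler):
// let sign: UInt16 = bitPattern &>> (count &- 2) ^ 3

// Variant 1: spell the Int literal explicitly.
let v1: UInt16 = bitPattern &>> (count &- Int(2)) ^ 3

// Variant 2: drop the subtraction.
let v2: UInt16 = bitPattern &>> count ^ 3

// Variant 3: the conversion the diagnostic itself points at.
let v3: UInt16 = bitPattern &>> (count &- 2) ^ UInt16(3)
```

Since bitPattern is zero, every shifted value is zero and all three results equal 3; the point is only which spellings type-check.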
(For future reference, it's really good to provide examples that completely demonstrate a problem, so that people looking at it don't have to guess/fill in the gaps to try to make it compile (or not compile in the right way, in this case); this one was simple enough, but making it as easy as possible for people to help is great.)
I commented on the bug, but this looks like a problem with diagnostics, not inference. There are no heterogeneous masking shifts, and by the way the precedence here is actually (bitPattern &>> (count &- 1)) ^ 1, not bitPattern &>> ((count &- 1) ^ 1), which is what the error's underlining shows.
It looks like there is a heterogeneous masking shift on FixedWidthInteger. So perhaps there is something more going on here than diagnostics, because we're not picking that up and inferring the literal type.
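A minimal sketch of both points, assuming the standard library's heterogeneous FixedWidthInteger overload (&>> taking any BinaryInteger on the right) and the stock precedence groups:

```swift
// FixedWidthInteger declares a heterogeneous masking shift,
// so a UInt16 can be shifted by an Int directly:
let x: UInt16 = 0b10
let shift: Int = 1
let y: UInt16 = x &>> shift       // UInt16 &>> Int, no conversion needed

// And &>> (BitwiseShiftPrecedence) binds tighter than ^
// (AdditionPrecedence), so this parses as (x &>> shift) ^ 1:
let z: UInt16 = x &>> shift ^ 1   // (2 >> 1) ^ 1 == 0
```

Given that overload, the literal in the original expression should have been inferred as UInt16 on the left of ^, so the failure does look like more than a bad diagnostic.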
I updated the bug to that effect. Someone will need to take a closer look at it.