Failed to infer type of integer literal

Why isn’t Swift able to infer the type of the rightmost 1 as UInt16 in this expression?

func amplitude(count:Int, bitPattern:UInt16) -> Int16
{
    let sign:UInt16 = bitPattern &>> (count &- 1) ^ 1
    ...
}
error: binary operator '^' cannot be applied to operands of type 'UInt16' and 'Int'
    let sign:UInt16 = bitPattern &>> (count &- 1) ^ 1
                      ~~~~~~~~~~~~~~~~~~~~~~~~~~~ ^ ~
note: expected an argument list of type '(UInt16, UInt16)'
    let sign:UInt16 = bitPattern &>> (count &- 1) ^ 1

Complete example:

let bitPattern: UInt16 = 0
let count: Int = 1

let sign: UInt16 = bitPattern &>> (count &- 2) ^ 3

Strangely, changing the 2 to Int(2) allows it to compile, as does removing the subtraction entirely (bitPattern &>> count ^ 1), in addition to the "obvious" fix of UInt16(3) that the error message is actually suggesting. Could you file a bug?
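For reference, here is a sketch of the three workarounds mentioned, using the bitPattern and count declarations from the complete example above (the compile-vs-not behavior is as reported in this thread, not independently re-verified here):

```swift
let bitPattern: UInt16 = 0
let count: Int = 1

// Original (fails to compile):
// let sign: UInt16 = bitPattern &>> (count &- 2) ^ 3

// Each of these variants compiles:
let a: UInt16 = bitPattern &>> (count &- Int(2)) ^ 3     // spell out the 2's type
let b: UInt16 = bitPattern &>> count ^ 1                 // drop the subtraction
let c: UInt16 = bitPattern &>> (count &- 2) ^ UInt16(3)  // spell out the 3's type
```

Since bitPattern is 0, all three shift results are 0 and only the xor operand survives.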

(For future reference, it's really helpful to provide examples that completely demonstrate a problem, so that people looking at it don't have to guess or fill in gaps to make it compile (or fail to compile in the right way, in this case). This one was simple enough, but making it as easy as possible for people to help is always appreciated.)


oh, i completely forgot i had local variables in that; i was focusing on the ^ 1 part lol

bug: 8002


I commented on the bug, but this looks like a problem with diagnostics, not inference. There are no heterogeneous masking shifts, and by the way the precedence here is actually:

(bitPattern &>> (count &- 2)) ^ 3

rather than

bitPattern &>> ((count &- 2) ^ 3)

i know that lol

Why does changing it to (3 as UInt16) fix it?

It looks like there is a heterogeneous masking shift on FixedWidthInteger. So perhaps there is something more going on here than diagnostics, because we're not picking that up and inferring the literal type.

I updated the bug to that effect. Someone will need to take a closer look at it.