How does this `[Int].reduce(_:_:)` call work?

My understanding, after reading the reduce(_:_:) page in the Apple Developer Documentation, is that in the first iteration the initial value (1) is passed in and `1 == 1` is evaluated, producing a Bool (here `true`). On the next iteration, reduce should then evaluate `true == 1`, which is not possible and should have errored out at compile time. Yet the code runs fine and I somehow get a value of `false`.

What is the explanation for this?

print([1, 1, 1].reduce(1, ==))  // false


So I tested another variant and printed out the types at runtime.

print(type(of: [1, 1, 1].reduce(true, ==)))  // Bool
print(type(of: [1, 1, 1].reduce(1, ==)))  // AnyHashable

I don't understand how print(type(of: [1, 1, 1].reduce(true, ==))) is a Bool. Shouldn't it also be AnyHashable then? Assuming it is performing AnyHashable(true) == AnyHashable(1)?


I don't have a complete answer but ... it is indeed upcasting to AnyHashable.

The standard library declares `== (AnyHashable, AnyHashable) -> Bool`, and that is the overload being used here. Magically, that function is implicitly converted to `(AnyHashable, AnyHashable) -> AnyHashable`. Why that happens, I'm not entirely sure. But I confirmed it is the case by copying and pasting the implementation of `reduce` from the standard library and printing all the input variables in the body.
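The debugging technique described above can be sketched like this. `myReduce` is a hypothetical stand-in with the same shape as the standard library's `reduce(_:_:)`, so (assuming overload resolution behaves the same for a free function as for the method) it lets you observe the types the compiler actually inferred:

```swift
// Hypothetical re-implementation of reduce(_:_:), used only to observe
// which types the compiler inferred for the call.
func myReduce<S: Sequence, Result>(
    _ sequence: S,
    _ initialResult: Result,
    _ nextPartialResult: (Result, S.Element) -> Result
) -> Result {
    var result = initialResult
    for element in sequence {
        // Print the inferred types of the accumulator and the element.
        print(type(of: result), type(of: element))
        result = nextPartialResult(result, element)
    }
    return result
}

let result = myReduce([1, 1, 1], 1, ==)
print(type(of: result))
```

If the same conversions kick in here as in the original call, this prints `AnyHashable AnyHashable` on every round and `AnyHashable` for the final result.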

So, your sequence is seen as [AnyHashable(1), AnyHashable(1), AnyHashable(1)], the initial result input is AnyHashable(1), and your reduce function is treated as (AnyHashable, AnyHashable) -> AnyHashable.

While `true == 1` would fail at compile time, `AnyHashable == AnyHashable` is valid, and that's why this compiles. That `==` first checks whether the wrapped types are the same, and only then compares the values; if the wrapped types differ, it returns false.

Concretely, on round one, AnyHashable(1) == AnyHashable(1) which produces AnyHashable(true). On round two, AnyHashable(true) == AnyHashable(1) produces AnyHashable(false). Hence, you get AnyHashable(false).
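The round-by-round claim is easy to check directly with plain `AnyHashable` comparisons (a minimal sketch, outside of `reduce`):

```swift
// Round one: both sides wrap an Int, and the wrapped values are equal.
let round1 = AnyHashable(1) == AnyHashable(1)    // true

// Round two: the wrapped types differ (Bool vs. Int), so == is false
// without ever comparing the values.
let round2 = AnyHashable(true) == AnyHashable(1) // false
print(round1, round2)  // true false
```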

So, why does your second example return Bool? Inspecting the argument types, in this case, mystic conversion magic takes the == as (Bool, AnyHashable) -> Bool. So you get Bool at the end.

I'd love for a compiler engineer to detail the rules for the AnyHashable function type transformations.


Thanks a lot for the detailed breakdown πŸ™‚ not only did I get an explanation, but I also got to know a new debugging technique.


Interesting example, thank you.

Perhaps it is possible to explain the compiler's behaviour in each of the following sub-examples, yet the results are not obvious upfront:

func test() {
    let a: [Int] = []
    let b: [Int: Int] = [:]
    let c: Set<Int> = []
    _ = a.reduce(true, ==) // βœ… this compiles
    _ = b.reduce(true, ==) // πŸ›‘ No exact matches in reference to operator function '=='
    _ = c.reduce(true, ==) // βœ… this compiles
    _ = [1].reduce(true) { r, e in
        return r == e // πŸ›‘ Binary operator '==' cannot be applied to operands of type 'Bool' and 'Int'
    }
    // and this doesn't compile either
    _ = ([1] as [Int]).reduce(true) { r, e in
        return r == e // πŸ›‘ Binary operator '==' cannot be applied to operands of type 'Bool' and 'Int'
    }
}
I'd assume the reason for this behaviour is that converting a number to a boolean is somehow an allowed operation in this context (e.g. due to protocol conformances), so in the first case the compiler simply converts the array's values to booleans and performs the equality operation, resulting in Bool as output. In the second case, the only common type it can find is probably the fallback, the AnyHashable wrapper; in other words, it "chooses" to perform type erasure instead of conversion.
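Whatever the exact rule, the effect is visible from the two calls earlier in this thread: the type of the initial value steers which shape of `==` the solver lands on, and therefore the result type:

```swift
// Bool initial value: == is taken as (Bool, AnyHashable) -> Bool,
// so the whole reduce produces a Bool.
let a = [1, 1, 1].reduce(true, ==)

// Int initial value: everything is erased, == is taken as
// (AnyHashable, AnyHashable) -> AnyHashable, and the result is AnyHashable.
let b = [1, 1, 1].reduce(1, ==)

print(type(of: a), type(of: b))  // Bool AnyHashable
```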

Yet this is a bit of an odd behaviour. I'd prefer it not to compile at all in such cases, due to the type mismatch, rather than rely on assumptions. But that's probably not a choice at all: this is just how the type system works.

All of that, I think, is easily validated with a slightly more exotic case:

struct MyInt: ExpressibleByIntegerLiteral, Comparable {
    let value: Int

    init(integerLiteral value: IntegerLiteralType) {
        self.value = value
    }

    static func == (lhs: MyInt, rhs: MyInt) -> Bool {
        return lhs.value == rhs.value
    }

    static func < (lhs: MyInt, rhs: MyInt) -> Bool {
        return lhs.value < rhs.value
    }
}

let arr: [MyInt] = [1, 1, 1]
let value = arr.reduce(true, ==)  // error: no exact matches in reference to operator function '=='

Now it raises an error, since there is no way to convert MyInt to a boolean. I've tried to find the protocol that gives the compiler the ability to deduce this as it does for Int, but have failed so far.

UPD: Fixed to a more correct example of a custom type. And if I now add a Hashable conformance to MyInt, it becomes wrappable as AnyHashable and the call compiles again. So this comes down simply to operator overload resolution; the following post gave me the idea of why there is a Bool: presumably the most concrete possible overload is chosen.
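A sketch of the Hashable variant mentioned in the update (`MyHashableInt` is a hypothetical name for the fixed type); assuming the behaviour described above, adding Hashable makes the values wrappable as AnyHashable, so `reduce(true, ==)` compiles again:

```swift
struct MyHashableInt: ExpressibleByIntegerLiteral, Hashable {
    let value: Int

    init(integerLiteral value: IntegerLiteralType) {
        self.value = value
    }
    // == and hash(into:) are synthesized by the Hashable conformance.
}

let values: [MyHashableInt] = [1, 1, 1]
// Compiles via the AnyHashable overload of ==, just like the [Int] case.
// true and MyHashableInt(1) wrap different types, so every round is false.
let result = values.reduce(true, ==)
print(result)
```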


Overload resolution with implicit conversions such as AnyHashable doesn’t really have documented or even completely understood behavior, unfortunately.


What confuses me, though, is that this compiles...

_ = ([1] as [Int]).reduce(true, ==)

...while you'd kinda expect it to behave the same as the closure version?
