How does `associatedtype` inference work?

I've been trying to understand the inferred A.Foo below:

enum Container<T> { }

protocol Chain {
    associatedtype Next: Chain
    associatedtype Foo = Int // #1
    associatedtype Bar
}

typealias _Foo<Next: Chain> = Container<Next.Foo>

/// #2
extension Chain where Next.Foo == Float, Next.Bar == Float {
    typealias Foo = Float
}

enum EndChain: Chain {
    typealias Next = Self
    typealias Foo = Float
    typealias Bar = Double
}
enum A: Chain {
    typealias Next = EndChain
    typealias Bar = Double
}

func foo<T: Chain>(_: T.Type) -> String {
    "[ \(T.Next.Foo.self), \(T.Next.Bar.self) ] -> \(T.Foo.self)"
}
foo(A.self) // "[ Float, Double ] -> Float"

Since A.Next.Bar (that is, EndChain.Bar) isn't Float, the compiler shouldn't use declaration #2; it should fall back to the default #1. But somehow it still infers A.Foo to be Float despite the mismatch. Is this a bug? More importantly, is there a way to get Int (#1) when the condition isn't met?
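(For what it's worth, I assume that spelling out the witness explicitly in A, as in the untested sketch below, would sidestep inference and give Int; but I'd still like to understand why inference picks #2 here.)

enum A: Chain {
    typealias Next = EndChain
    typealias Foo = Int // assumption: an explicit declaration should shadow the extension typealias
    typealias Bar = Double
}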

Edit:

I've tried both Swift 5.2.4 on Xcode 11.5 (11E608c) and iOS Playground (Swift 5.1).

I believe this is a known bug. See this thread: Should this program really compile (is the where clause ignored)?

For additional discussion about associatedtype inference, see also: What kind of magic behavior is this?

I believe the resident expert on the topic is @Douglas_Gregor.

It might be interesting to note that the first demonstration program of that thread now fails to compile (as expected), at least with Swift 5.2.4.

As shown here.
struct S<T> {
    var a : A
}
extension S where T == Never {
    typealias A = Int
}
let v = S<Int>(a: 123)
print(v)

$ swiftc --version
Apple Swift version 5.2.4 (swiftlang-1103.0.32.9 clang-1103.0.32.53)
Target: x86_64-apple-darwin19.4.0

$ swiftc test.swift 
test.swift:2:13: error: 'S<T>.A' (aka 'Int') requires the types 'T' and 'Never' be equivalent
    var a : A
            ^
test.swift:4:1: note: requirement specified as 'T' == 'Never' [with T = T]
extension S where T == Never {
^
test.swift:7:19: error: cannot convert value of type 'Int' to expected argument type '<<error type>>'
let v = S<Int>(a: 123)
                  ^~~

I haven't looked closely at the example in the OP. What Swift version are you using, @Lantua?

I've tried both Swift 5.2.4 on Xcode 11.5 (11E608c) and iOS Playground (Swift 5.1).

Ah, yes I get the same output as you when I run your example.

I noticed that I get the following error if I try to do this:

print(A.Foo.self) // ERROR: 'A.Foo' (aka 'Float') requires the types 'EndChain.Bar' (aka 'Double') and 'Float' be equivalent

I guess the following is a reduced demonstration of (at least part of) the same issue:

protocol P { associatedtype A = Bool }
extension P where Self: BinaryFloatingPoint { typealias A = Float }
extension String: P {}
print(String.A.self) // ERROR: Type 'String' does not conform to protocol 'BinaryFloatingPoint'

Commenting out the `extension P where Self: ...` line will make it compile.

But why shouldn't it compile as is (and print Bool)?


Also, this compiles and runs (with unexpected result, IMHO):

protocol P { associatedtype A = Bool }
extension P where Self: BinaryFloatingPoint { typealias A = Float }
extension String: P {}

func foo<T: P>(_: T.Type) { print(T.A.self) }
foo(String.self) // Prints Float ... What? Compiler thinks String conforms to BinaryFloatingPoint now?

This looks like it's related to SR-10158 et al, which @jrose and @Ben_Cohen say is correct behavior. I wonder if they think my examples here work as intended too?
Also cc @Slava_Pestov who AFAICS has fixed related bugs.

I don't think this is the same as SR-10158. In that bug, an associated type is inferred from a constrained extension's constraints; in these examples, it's inferred from a typealias within a constrained extension. I don't know why the first example doesn't compile or why the second one does (I agree that the behavior is wrong!), but if I recall correctly it's been a longstanding weirdness that associated type checking considers typealiases in protocol extensions without looking at their constraints. That's bizarre, but it's related to type lookup not doing overload resolution:

protocol P {}
extension P where Self: AdditiveArithmetic { typealias A = Bool }
extension P where Self: BinaryFloatingPoint { typealias A = Float }
extension Float: P {}

let x: Float.A? = nil // should be the same as 'Float?' rather than 'Bool?', right?

But I haven't worked on this part of the compiler, so someone else should really answer.

EDIT: Oh, and my conclusion is that no one should use typealiases in constrained protocol extensions until this gets worked out, i.e. until there's an actual design. But that's just my opinion.

Now I'm thinking that maybe it shouldn't compile, for the same reasons that this doesn't:

protocol P {
  associatedtype A
}
extension P {
  typealias A = Bool
}
extension P where Self: BinaryFloatingPoint {
  typealias A = Float // ERROR: Invalid redeclaration of 'A'
}

Assuming the above is behaving as intended and is equivalent to the following:

protocol P {
  associatedtype A = Bool
}
extension P where Self: BinaryFloatingPoint {
  typealias A = Float // NOTE: This currently compiles but should maybe be same error?
}

So maybe this boils down to the question of whether

protocol P { associatedtype A }
extension P { typealias A = Bool }

is (intended to be) equivalent to

protocol P { associatedtype A = Bool }

or not.


But, as it turns out, I found a way to make it work as I originally expected, and I believe the same "trick" can be applied to something like the example in the OP as well (though I agree with @jrose that we can't rely on any of this until it gets properly sorted out).

Trick that makes the type-level either/choice/optional thing work
protocol P {
  associatedtype A
  static var foo: A { get }
}

extension P {
  static var foo: Bool { return true }
}

extension P where Self: BinaryFloatingPoint {
  static var foo: Float { return 1.23 }
}

extension String: P {}
extension Double: P {}

func bar<T: P>(_: T.Type) {
  print(T.A.self)
}

print(String.A.self) // Bool
print(Double.A.self) // Float
bar(String.self) // Bool
bar(Double.self) // Float
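
For what it's worth, here is an untested sketch of how the same idea might be adapted to the Chain example from the OP. The fooValue name is made up, and I haven't verified that inference handles the Next.Foo / Next.Bar same-type constraints the same way it handles Self: BinaryFloatingPoint above:

protocol Chain {
    associatedtype Next: Chain
    associatedtype Foo
    associatedtype Bar
    // Non-type member whose only purpose is to drive associated type inference.
    static var fooValue: Foo { get }
}

extension Chain {
    // Plays the role of the default, associatedtype Foo = Int (#1).
    static var fooValue: Int { return 0 }
}

extension Chain where Next.Foo == Float, Next.Bar == Float {
    // Plays the role of the constrained typealias Foo = Float (#2).
    static var fooValue: Float { return 0 }
}

enum EndChain: Chain {
    typealias Next = Self
    typealias Bar = Double
    static var fooValue: Float { return 0 } // pins EndChain.Foo to Float
}

enum A: Chain {
    typealias Next = EndChain
    typealias Bar = Double
}

print(A.Foo.self)        // hoped for: Int, since EndChain.Bar is Double, not Float
print(EndChain.Foo.self) // Float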

At a base level, those two are not equivalent. associatedtype A = Bool declares that the protocol has an associated type and that it defaults to Bool in the absence of any other information. associatedtype A + typealias A = Bool declares an associated type and additionally an unconditional typealias on the same protocol, with no indication of which one would shadow the other. It should probably be an error.
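
To make the "default" reading concrete, here is a small sketch (the names Q, UsesDefault, and OverridesDefault are made up): the Bool default is used only when nothing else pins A down, and an explicit declaration in the conforming type wins over it.

protocol Q {
    associatedtype A = Bool // default, used only when nothing else determines A
}

struct UsesDefault: Q {}        // nothing pins A, so it falls back to Bool

struct OverridesDefault: Q {
    typealias A = Int           // explicit declaration wins over the default
}

print(UsesDefault.A.self)       // Bool
print(OverridesDefault.A.self)  // Int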

I’ve looked at your workaround now, and that should be reasonable, even with the unfortunate extra static member. That’s pretty much what associated type inference is designed for; it’s just too bad there’s no way to invoke it without a non-type member.

Thanks. What I still don't understand, though, is the following behavior, i.e. how to answer questions [1] and [2] in this program:

protocol P {
  associatedtype A = Bool
}
extension P where Self: BinaryFloatingPoint {
  typealias A = Float
}

func foo<T: P>(_: T.Type) {
  print(T.A.self)
}

extension Double: P {}
extension String: P {}

foo(Double.self) // Prints Float
foo(String.self) // Prints Float … [1]: Why/how?

print(Double.A.self) // Prints Float
//print(String.A.self) // Doesn't compile … [2]: Why not? (Error msg makes no sense to me.)

I just can't understand the current behavior; I would expect this program to compile and print:

Float
Bool
Float
Bool

You've already addressed this upthread and said the current behavior is wrong:

But I think this program exemplifies what most would perceive as basic/normal usage of Swift generics, i.e. this does not look very complicated:

protocol P {
  associatedtype A = Bool
}
extension P where Self: BinaryFloatingPoint {
  typealias A = Float
}

and I think most people would agree on what it means. So the current behavior/bug(s) seems kind of serious, IMHO.

Does your "until there's an actual design" mean that there is no intended behavior for this example program?

I created this thread in case there isn't, to see if people have the same expectations or not.

I don't think I'm an authoritative source anymore, but yes, that was my intent.

I'm not 100% convinced we should support your proposed interpretation of the not-obviously-conflicting form, but I do agree that it should not silently compile and then do the confusing thing it does today. I'm not sure what the path forward for that is; as with changing the expression type checker, changing type lookup and associated type inference without changing the meaning of existing programs can be extremely difficult. But that doesn't mean it's not worth trying!
