What kind of magic behavior is this?

I'm not sure I follow that example, but looking at Indices on Collection I see that it defaults to DefaultIndices<Self>, and that type has a similar extension: extension DefaultIndices where Self.Element == String. By the logic of this magical behavior I can conclude that if Self.Element were not otherwise provided by the implementor, the compiler could potentially infer Element to be String. :exploding_head:

Even worse, the compiler is likely to try to infer Element as String because of all these extensions, which are also satisfied when Self.Element is String:

extension DefaultIndices where Self.Element : Equatable 
extension DefaultIndices where Self.Element : Sequence
extension DefaultIndices where Self.Element : Comparable
extension DefaultIndices where Self.Element : StringProtocol 
struct A: RandomAccessCollection {
  var startIndex: Int { return 0 }
  var endIndex: Int { return 0 }
  subscript(i: Int) -> Int { return 0 }
  // RandomAccessCollection also picks up the default index(after:) where Index == Int
}

print(type(of: A().indices))
// Range<Int>

struct B: Collection {
  var startIndex: String.Index { return "".startIndex }
  var endIndex: String.Index { return "".endIndex }
  func index(after: String.Index) -> String.Index { fatalError() }
  subscript(i: String.Index) -> Int { return 0 }
}

print(type(of: B().indices))
// DefaultIndices<B>

This might be a related post I remember glancing at. I will still need to re-read it, but it might be interesting for the readers of this thread: [RFC] Associated type inference


If DefaultIndices were the only thing driving the collection's Index type, and you didn't have to implement any methods on Collection that otherwise set Index to be some other type, then yes, Index could conceivably be inferred via a default implementation of indices. Of course, that's not the case, so it would never happen.

edit: Index type of the collection, not Element type

I understand that, but the current behavior makes it impossible to express "the user should only gain access to the extension when their type provides the necessary constraints". Instead it turns into "if the user forgot to explicitly specify the associated type, it will default to whatever the conditional extension had as a same-type constraint", which makes it equivalent to explicitly specifying a default for the associated type in the protocol requirements. Above I even provided a similar example where the default associated type was simply ignored because of this behavior.

To me that is clearly a miscommunication of contracts between the compiler and the developer.


This thread might be interesting (unless I glanced this thread too quickly and misunderstood the issue).

To sum up the original problem: I think the compiler should emit an error and not automatically satisfy the conformance to the protocol in such cases, unless one of the following is provided (tied to the original example):

  • S has an explicit typealias Something = Int
  • or S implements something as var something: Int, which would also allow S to access any other member of extension Test where Something == Int (if there were any other members)

In any other case this is likely to be a source of bugs, and it limits the expressiveness of the language. That is, of course, my personal opinion about this behavior.
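For readers joining the thread here, a minimal reconstruction of the original example (the exact declarations of Test, Something, and S are my assumption, based on the names used in this thread) shows the inference in action:

```swift
protocol Test {
    associatedtype Something
    var something: Something { get }
}

extension Test where Something == Int {
    // A default implementation, gated on Something == Int.
    var something: Something { return 0 }
}

// S provides neither `typealias Something = Int` nor a
// `something` property, yet the conformance is satisfied:
// the compiler infers Something == Int from the same-type
// constraint on the extension above.
struct S: Test {}

print(S.Something.self) // Int
```

That is, the constrained extension ends up acting like a default for the associated type, rather than as an opt-in for types that already bind Something to Int.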

There are long standing compiler bugs in this area.

For example, the following (as far as I can see, clearly invalid) program still compiles with Swift 4.2 (I haven't checked 5.0):

struct X<T> {
    var a : A
}
extension X where T == Never {
    typealias A = Int
}
let x = X<Int>(a: 123)

This comes down to believing associated type inference shouldn’t be a thing. The where clause part is just restricting when the inference can kick in. Being against associated type inference is fine, and given a do-over probably it would be on the cards, but it would be a radical change to the current behavior and would face (likely insurmountable) source compatibility challenges.

I’d also caution against using vague but hyperbolic phrases like “prevents the expressiveness of the language” when voicing concerns like these, because they don’t have much meaning. One person’s “this is dubious magic... it hurts expressivity!” is another’s “this is a way of giving types functionality for free... it helps expressivity!”.

Only real usage examples showing help or harm can make the case. The indices example is a real-world one where benefit is gained. It’s also one that can cause unexpected, maybe incorrect behavior... but let’s talk in concrete examples, not “somethings”.


I'm not against the inference in general. All I'm trying to say is that it's acting way too aggressively in this case and should be restricted a little more. I understand that this is 'potentially' a source-breaking change even if we only forbade the inference in such cases. Still, I would prefer it, as it makes the language (and the inference) more predictable.

The example I provided is of course an oversimplification, but it's reduced to the point where everyone who starts reading this thread can quickly follow it. I also understand that my 'Swift developer' word goes against the 'Swift compiler expert' word; that's also totally fine. Please don't get the impression that I'm trying to hate on something here. All I was trying to say in this thread up until now is that the inference behavior is too surprising and the compiler is doing something I do not expect it to do, since I don't think I expressed that behavior in my code sample.

I would also like to apologize to you or anyone reading this thread if you have a hard time understanding me or the way I'm writing, as I'm not a native English speaker and it's already hard enough for me to write this much detail in English. ;)

@Ben_Cohen: I believe my example above and the following two example programs are related to the OP (by the conditional extension issue, which as far as I can see must be a bug). Is their behavior as intended?

Program 1
protocol P {
    associatedtype Q
    var v: (D, E, F) { get }
}
extension P where Q == Int {
    typealias D = Int
}
extension P where Q == Bool {
    typealias E = Bool
}
extension P where Q == Float {
    typealias F = Float
}
struct R<T> : P {
    typealias Q = T
    var v: (D, E, F)
}
print(R<String>.D.self) // Int
print(R<String>.E.self) // Bool
print(R<String>.F.self) // Float
print(R<Double>.D.self) // Int
print(R<Double>.E.self) // Bool
print(R<Double>.F.self) // Float

Program 2
struct S<A> {
    var hmm: (A, B, C)
}
extension S where A == Bool {
    typealias B = A
    static func printB() { print(B.self) }
}
extension S where A == Float {
    typealias C = A
    static func printC() { print(C.self) }
}
let a = S(hmm: ("strange", true, Float(42.1)))
print(a) // S<String>(hmm: ("strange", true, 42.1))

I understand the first example; it does seem to be related, indeed. The second example, however, is really mind-bending. There are multiple issues at the same time. First of all, the type should already error out, as B and C are unknown types, but it somehow finds the type aliases from the extensions. Then the type aliases always say that B and C are A. On top of that, there is the issue from the original post, where the where clause is completely ignored. Lastly, what really blows my mind is that A is set to String but you can still have B as Bool and C as Float.

These examples throw other things into the mix, like relying on type aliases only stated in extensions. That's a strange feature and clearly has some potentially buggy or at least surprising behaviours, but it's not really relevant to this particular discussion.

Reduced examples are great for clearly showing what the behavior is. But they don't help to make a case for or against that behavior (except in cases where they show the behavior to be obviously confusing on its face, like Jens' examples).

To make a case that a feature is harmful, you need to show real-world usage that could cause harm.

Here is an example using indices that shows potential harm:

struct EveryOther<Base: Collection> where Base.Index == Int {
  let _base: Base
}

extension EveryOther: RandomAccessCollection {
  typealias Element = Base.Element
  typealias Index = Base.Index
  var startIndex: Index { return _base.startIndex }
  var endIndex: Index { return _base.endIndex }
  subscript(i: Index) -> Element { return _base[i] }

  func index(after i: Index) -> Index {
    return i < _base.endIndex - 1 ? i + 2 : endIndex
  }
}

let c = EveryOther(_base: 0..<10)
print(type(of: c.indices)) // Range<Int>... uh-oh

// prints every other element... 👍🏻
for e in c { print(e, terminator: ",") }
// prints every element... 🤬
for i in c.indices { print(c[i], terminator: ",") }

This example is somewhat contrived. There is no good reason for EveryOther to constrain to integers. But lots of people do that, so it's possible people hit this problem today. It is also an example that would show harm even without the associated type. index(offsetBy:) is busted too. These are the pitfalls of default behaviors on collections. I'm not sure associated types are special in this regard.
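To make the index(offsetBy:) remark concrete, here is a self-contained sketch (re-declaring EveryOther from the post above): the default index(_:offsetBy:) for random-access collections with a Strideable Int index does plain integer arithmetic and never consults the custom index(after:), so the two disagree.

```swift
// Re-declaring EveryOther from above so this compiles standalone.
struct EveryOther<Base: Collection> where Base.Index == Int {
    let _base: Base
}

extension EveryOther: RandomAccessCollection {
    typealias Element = Base.Element
    typealias Index = Base.Index
    var startIndex: Index { return _base.startIndex }
    var endIndex: Index { return _base.endIndex }
    subscript(i: Index) -> Element { return _base[i] }

    func index(after i: Index) -> Index {
        return i < _base.endIndex - 1 ? i + 2 : endIndex
    }
}

let c = EveryOther(_base: 0..<10)

// The custom successor skips every other element...
print(c.index(after: c.startIndex)) // 2

// ...but the default index(_:offsetBy:) comes from the
// Index: Strideable constrained extension, which advances
// by plain integer arithmetic and ignores index(after:).
print(c.index(c.startIndex, offsetBy: 1)) // 1
```

So a caller offsetting by 1 and a caller taking the successor land on different indices, which is exactly the kind of silent inconsistency these constrained defaults can produce.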


Thanks @Ben_Cohen for the additional information. I have started thinking about how I would ideally like to see the implementation of RandomAccessCollection work in this regard.

My first thought was that the extension should be constrained with respect to Index alone. That is, where Index: Strideable, Index.Stride == Int. However, that doesn’t quite work, because it would effectively have typealias Indices = Range<Index> in the extension, and thereby interfere with concrete types that use something else for Indices.

Essentially, RandomAccessCollection wants to say, “When Indices is undetermined, make it Range<Index> if possible and provide this default implementation.”

Conversely, in Adrian’s example, the Test protocol wants to say, “If and only if Something is determined to be Int, then provide this default implementation.”

Logically, RandomAccessCollection ought to provide a *constrained default value* for Indices. This is not currently possible, but it might look like this if we allowed it:

extension RandomAccessCollection
  where Index: Strideable,
        Index.Stride == Int {
  // Provide a default value for the associated type
  default associatedtype Indices = Range<Index>
}

extension RandomAccessCollection
  where Index: Strideable,
        Index.Stride == Int,
        Indices == Range<Index> {
  // Provide the default implementation
  var indices: Range<Index> { return startIndex ..< endIndex }
}

I don’t know if or how this would work “under the hood”, but from a programmer’s perspective it follows expectations much more naturally.


Just a small side note: the workaround for the original example would be to provide an extension constrained to Never; then the compiler no longer infers the type and generates the expected error message Type 'S' does not conform to protocol 'Test' (the fatalError() body here is just a placeholder, since the member only exists to create the competing constraint):

extension Test where Something == Never {
  var something: Something { fatalError() }
}

Thinking on it more, here is how my mental model of constrained extensions wants them to work when inferring associated types for a protocol conformance:

  • First consider everything unconstrained, and infer as many associated types as possible.
  • Then include any constrained extensions whose conditions are completely satisfied.
  • Repeat until nothing new is found, or an unresolvable conflict arises.
  • If all associated types have been determined, success. Otherwise, raise an error.
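As a toy model of that fixed-point loop (every name here — ConstrainedExtension, infer, the string-based representation of types — is hypothetical, not a compiler internal):

```swift
// A constrained extension requires some associated-type bindings
// to already hold, and contributes new bindings once they do.
struct ConstrainedExtension {
    let requires: [String: String]   // e.g. ["Q": "Int"]
    let provides: [String: String]   // e.g. ["D": "Int"]
}

enum InferenceResult {
    case success([String: String])
    case conflict(String)
}

func infer(initial: [String: String],
           extensions: [ConstrainedExtension]) -> InferenceResult {
    var known = initial
    var changed = true
    while changed {
        changed = false
        for ext in extensions {
            // Include an extension only when all of its constraints
            // are completely satisfied by what is already known.
            let satisfied = ext.requires.allSatisfy { known[$0.key] == $0.value }
            guard satisfied else { continue }
            for (name, type) in ext.provides {
                if let existing = known[name], existing != type {
                    return .conflict(name) // unresolvable conflict
                }
                if known[name] == nil {
                    known[name] = type   // something new was found
                    changed = true
                }
            }
        }
    }
    return .success(known)
}
```

Under this model, Program 1's R<Int> would pick up D == Int from the Q == Int extension, while R<String> would fail the final step, since no extension fires and D, E, F stay undetermined.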


I second this inference path of your mental model.

This is the opposite of the usual and desirable behavior of Swift, which is to choose the most specific case possible, not the least specific. Collections that could use Range as their Indices would get DefaultIndices instead. This would be both source-breaking and performance pessimizing compared to the current situation.


I think this shows there is a gap in the language that we need to explore and fill, one that would allow us to express inference priority instead of relying on a mixed set of inference rules that can potentially lead to a wrong or unexpected result, as you showed in your previous example.