Associated type inference fun with RandomAccessCollection

Hi all,

While working on the type checker, I came across an interesting case for associated type inference with the ‘Indices’ type of RandomAccessCollection. At issue is a simple model of RandomAccessCollection where the Index type is Int:

class ReferenceCollection : RandomAccessCollection {
  typealias Index = Int
  
  var startIndex: Int {
    return 0
  }

  var endIndex: Int {
    return 1
  }

  subscript(index: Int) -> String {
    return ""
  }

  func index(after i: Int) -> Int {
    return 1
  }

  func index(before i: Int) -> Int {
    return 0
  }
}

What’s the inferred associated type Indices? The RandomAccessCollection protocol has a default:

protocol RandomAccessCollection {
    associatedtype Indices : _RandomAccessIndexable, BidirectionalCollection
      = DefaultRandomAccessIndices<Self>
    var indices: Indices { get }
}

which will kick in if nothing else can be inferred. There is also an implementation for this defaulted case in a protocol extension from which we can infer Indices:

extension RandomAccessCollection where Indices == DefaultRandomAccessIndices<Self> {
   public var indices: DefaultRandomAccessIndices<Self> { /* body elided */ }
}

Those line up, which is easy, but there is *another* protocol extension of RandomAccessCollection from which we can infer Indices:

extension RandomAccessCollection
where Index : Strideable,
      Index.Stride == IndexDistance,
      Indices == CountableRange<Index> {

  public var indices: CountableRange<Index> {
    return startIndex..<endIndex
  }
}

Note that both DefaultRandomAccessIndices<ReferenceCollection> and CountableRange<Int> would be valid inferences for Indices. We have three options:

1) Consider type inference to be ambiguous, because there is no natural ordering between the two protocol extensions (they have incompatible same-type constraints on the associated type Indices).
2) Consider the first protocol extension to “win” because… we prefer the extension which corresponds to the associated type default (?). This would be consistent with a world where we don’t have associated type inference at all. (It also matches Swift 3.0.1’s behavior).
3) Consider the second protocol extension to “win” because… the other protocol extension corresponds to the associated type default, and could therefore be considered a lowest-common-denominator implementation that is only there to provide the most basic defaults.

For reference, Swift 3.0.1 picked DefaultRandomAccessIndices<ReferenceCollection>, current Swift master picks CountableRange<Int>, and my work-in-progress to improve the type checker calls it ambiguous, hence the question :)

  - Doug

I can see the appeal of option 3, but IMO anything other than option 1 seems pretty brittle. Presumably with that option, and with the class providing a typealias for Indices, you would no longer have an ambiguity and the code would compile, correct?

Mark


on Mon Nov 07 2016, Douglas Gregor <swift-dev-AT-swift.org> wrote:

Note that both DefaultRandomAccessIndices<ReferenceCollection> and CountableRange<Int> would be
valid inferences for Indices. We have three options:

1) Consider type inference to be ambiguous, because there is no natural ordering between the two
protocol extensions (they have incompatible same-type constraints on
the associated type Indices).

That seems reasonable, but I would like to have a way to *create* such a
natural ordering.

2) Consider the first protocol extension to “win” because… we prefer
the extension which corresponds to the associated type default
(?).

Up until now, specific extensions have never behaved like (or at least,
have never been intended to behave like) distinguishable entities in the
user model; I'm wary of entering that world, though I know it has been
discussed w.r.t. conditional conformances.

This would be consistent with a world where we don’t have
associated type inference at all. (It also matches Swift 3.0.1’s
behavior).

?? This statement makes no sense to me. If there's no associated type
inference, what would it mean for this extension to "win?"


--
-Dave

On Nov 7, 2016, at 7:07 PM, Mark Lacey <mark.lacey@apple.com> wrote:

Presumably with that option, and with the class providing a typealias for Indices, you would no longer have an ambiguity and the code would compile, correct?

Yes, adding an explicit typealias (to either of them) fixes the issue.
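
For concreteness, a minimal sketch of that workaround applied to the original example, using the Swift 3-era names from this thread: spelling out Indices explicitly means no inference is needed, and per Doug's note either choice works.

class ReferenceCollection : RandomAccessCollection {
  typealias Index = Int
  // Explicit choice; DefaultRandomAccessIndices<ReferenceCollection> would work equally well.
  typealias Indices = CountableRange<Int>

  var startIndex: Int { return 0 }
  var endIndex: Int { return 1 }
  subscript(index: Int) -> String { return "" }
  func index(after i: Int) -> Int { return 1 }
  func index(before i: Int) -> Int { return 0 }
}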

  - Doug


Voting for 1. This is an ambiguity in stdlib through and through IMO.

~Robert Widmann


On Nov 8, 2016, at 1:58 PM, Dave Abrahams via swift-dev <swift-dev@swift.org> wrote:
on Mon Nov 07 2016, Douglas Gregor <swift-dev-AT-swift.org> wrote:

1) Consider type inference to be ambiguous, because there is no natural ordering between the two
protocol extensions (they have incompatible same-type constraints on
the associated type Indices).

That seems reasonable, but I would like to have a way to *create* such a
natural ordering.

One such way is to drop the same-type requirement (Indices == DefaultRandomAccessIndices<Self>) from the first extension, making it an unconstrained extension and, therefore, more general than the second (constrained) extension. I think that’s the best solution here. The downside is that a concrete type like ‘ReferenceCollection’ will have the subscript operators from both RandomAccessCollection extensions. That’s a problem I think we should solve more generally, perhaps with some name-shadowing rule or keyword to say “only use this declaration to satisfy a requirement and for nothing else”.

I’ll go ahead with this solution for now.
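
Not the actual stdlib patch, but a sketch of the shape described above: the same extension as before with its same-type requirement dropped, so the CountableRange<Index> extension becomes strictly more specialized and a natural ordering exists between the two. The fatalError is a placeholder for the construction of DefaultRandomAccessIndices, which was elided in the original message as well.

extension RandomAccessCollection {
  // Unconstrained "lowest common denominator" implementation: it can only
  // witness the Indices requirement when Indices == DefaultRandomAccessIndices<Self>,
  // but as a member it is now visible on every RandomAccessCollection,
  // which is the downside mentioned above.
  public var indices: DefaultRandomAccessIndices<Self> {
    fatalError("construction of DefaultRandomAccessIndices elided")
  }
}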

2) Consider the first protocol extension to “win” because… we prefer
the extension which corresponds to the associated type default
(?).

Up until now, specific extensions have never behaved like (or at least,
have never been intended to behave like) distinguishable entities in the
user model; I'm wary of entering that world, though I know it has been
discussed w.r.t. conditional conformances.

This would be consistent with a world where we don’t have
associated type inference at all. (It also matches Swift 3.0.1’s
behavior).

?? This statement makes no sense to me. If there's no associated type
inference, what would it mean for this extension to "win?"

If there’s no associated type inference, one would get the associated type default, DefaultRandomAccessIndices<Self>.

  - Doug

···

On Nov 8, 2016, at 1:58 PM, Dave Abrahams via swift-dev <swift-dev@swift.org> wrote:
on Mon Nov 07 2016, Robert Widmann <swift-dev-AT-swift.org> wrote:

Voting for 1. This is an ambiguity in stdlib through and through IMO.

No, this is the standard library doing the best it can with a type
checker that has mostly-unspecified semantics. It can only be
considered an ambiguity in the standard library if you presume the
semantics of choice 1, which was never specified... hence Doug's
question.

--
-Dave

on Tue Nov 08 2016, Douglas Gregor <dgregor-AT-apple.com> wrote:

That seems reasonable, but I would like to have a way to *create* such a
natural ordering.

One such way is to drop the same-type requirement (Indices ==
DefaultRandomAccessIndices<Self>) from the first extension, making it
an unconstrained extension and, therefore, more general than the
second (constrained) extension. I think that’s the best solution
here. The downside is that a concrete type like ‘ReferenceCollection’
will have the subscript operators from both RandomAccessCollection
extensions. That’s a problem I think we should solve more generally,
perhaps with some name-shadowing rule or keyword to say “only use this
declaration to satisfy a requirement and for nothing else”.

I like the latter very much, and it is a good use-case for an explicit
"override" on protocol methods.

I’ll go ahead with this solution for now.


This would be consistent with a world where we don’t have
associated type inference at all. (It also matches Swift 3.0.1’s
behavior).

?? This statement makes no sense to me. If there's no associated type
inference, what would it mean for this extension to "win?"

If there’s no associated type inference, one would get the associated type default,
DefaultRandomAccessIndices<Self>.

To me that just sounds like more-limited inference, but OK.


--
-Dave