What are the rules on inheriting associated types from protocols?

I've never been too clear on how associated types and type aliases trickle down from parent protocols. Here's a simplified version of a real, relevant Collection-y problem. Can this error…

Reference to invalid associated type 'Index' of type 'S'

…be resolved without having to manually use Int within S or one of its extensions, if the subscript does indeed need to be defined at the level of S and not higher up the chain?

public protocol P: RandomAccessCollection where Index == Int {}

extension P {
  public var startIndex: Index { 0 }
}
public struct S { }

extension S: P {
  public var endIndex: Index { 0 }
  public subscript(index: Index) -> Void { () }
}

If you declare it as struct S: P does the compiler error go away? Or if you try the latest 5.7 compiler, does that improve it?

  1. No.

  2. Different sort of no. I wouldn't try a thing with a lower version anymore.

Your code example works if I write:

public protocol P: RandomAccessCollection where Index == Int {
  associatedtype Index
}

"Re-stating" an associated type in this manner is a no-op -- the generics implementation doesn't consider it a distinct entity from Index in the parent protocol, which is constrained to Int -- but it helps nudge associated type inference into figuring out what's going on.
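Spelled out in full, the original example then becomes the following (a sketch, assuming the re-statement workaround behaves as described):

```swift
public protocol P: RandomAccessCollection where Index == Int {
  // No-op re-statement: not a distinct entity from the inherited Index,
  // which remains constrained to Int, but it helps associated type inference.
  associatedtype Index
}

extension P {
  public var startIndex: Index { 0 }
}

public struct S {}

extension S: P {
  public var endIndex: Index { 0 }
  public subscript(index: Index) -> Void { () }
}
```

Element is inferred as Void from the subscript, and the remaining RandomAccessCollection requirements fall out of the defaults for a Strideable Int index.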

This is a bug and we should fix it some day, but I think the fix isn't entirely straightforward. Associated type inference in general is kind of broken and needs a complete overhaul.


The use case I tried to simplify isn't solved by your suggestion. Is it currently solvable with another trick?

public protocol BackedByInteger
: ExpressibleByIntegerLiteral & Hashable & MutableCollection & RandomAccessCollection
where Index == Int {
  associatedtype Integer: FixedWidthInteger & _ExpressibleByBuiltinIntegerLiteral
  
  init(_: Integer)
}

// MARK: - ExpressibleByIntegerLiteral
extension BackedByInteger {
  public init(integerLiteral integer: Integer) {
    self.init(integer)
  }
}

// MARK: - MutableCollection, RandomAccessCollection
extension BackedByInteger {
  public var startIndex: Index { 0 }
}
/// The bits of an integer, from least significant to most.
public struct Bits<Integer: FixedWidthInteger & _ExpressibleByBuiltinIntegerLiteral> {
  public var integer: Integer
}

// MARK: - Collection
extension Bits: BackedByInteger {
  public var endIndex: Index { Integer.bitWidth }

  public subscript(index: Index) -> Integer {
    get { integer >> index & 1 }
    set {
      integer &= ~(1 << index)
      integer |= (newValue & 1) << index
    }
  }
}

// MARK: - BackedByInteger
extension Bits {
  public init(_ integer: Integer) {
    self.integer = integer
  }
}
/// The nybbles of an integer, from least significant to most.
public struct Nybbles<Integer: FixedWidthInteger & _ExpressibleByBuiltinIntegerLiteral> {
  public var integer: Integer
}

// MARK: - Collection
extension Nybbles {
  public var endIndex: Index { Integer.bitWidth / 4 }

  public subscript(index: Index) -> Integer {
    get { integer >> (index * 4) & 0xF }
    set {
      let index = index * 4
      integer &= ~(0xF << index)
      integer |= (newValue & 0xF) << index
    }
  }
}

// MARK: - BackedByInteger
extension Nybbles: BackedByInteger {
  public init(_ integer: Integer) {
    self.integer = integer
  }
}

Can you try adding a declaration associatedtype Index to protocol BackedByInteger?

That's what I'm saying -- it works for the simplified example but not there.

I don't like having to make an arbitrary decision about the one place to put Int, so I just restructured the whole concept :melting_face::

SmallIntegerCollection
public extension BinaryInteger {
  /// The bits of an integer, from least significant to most.
  var bits: SmallIntegerCollection<Self> {
    get { .init(container: self, elementBitWidth: 1) }
    set { self = newValue.container }
  }

  /// The nybbles of an integer, from least significant to most.
  var nybbles: SmallIntegerCollection<Self> {
    get { .init(container: self, elementBitWidth: 4) }
    set { self = newValue.container }
  }
}

/// A collection of integers which "fit" into a larger integer type.
///
/// `Container` is an integer that can contain `Container.bitWidth / elementBitWidth` elements.
/// `SmallIntegerCollection` divides it evenly; indexing is performed from least significant divisions to most.
public struct SmallIntegerCollection<Container: BinaryInteger> {
  public init(container: Container, elementBitWidth: Int) {
    self.container = container
    self.elementBitWidth = elementBitWidth
  }

  /// The backing storage unit for this collection.
  public var container: Container

  /// The number of bits needed for the underlying binary representation of each element of this collection.
  public var elementBitWidth: Int
}

// MARK: - public
public extension SmallIntegerCollection {
  /// An element with `1` for all bits.
  var mask: Container { ~(~0 << elementBitWidth) }
}

// MARK: - MutableCollection & RandomAccessCollection
extension SmallIntegerCollection: MutableCollection & RandomAccessCollection {
  public typealias Index = Int

  public var startIndex: Index { 0 }
  public var endIndex: Index { Container().bitWidth / elementBitWidth }

  public subscript(index: Index) -> Container {
    get { container >> (index * elementBitWidth) & mask }
    set {
      let index = index * elementBitWidth
      container &= ~(mask << index)
      container |= (newValue & mask) << index
    }
  }
}
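For what it's worth, here's how the restructured type behaves in practice. This snippet condenses SmallIntegerCollection inline so it runs on its own; the bit patterns in the assertions are worked out by hand, least significant first:

```swift
// Condensed, self-contained copy of the SmallIntegerCollection design above.
public struct SmallIntegerCollection<Container: BinaryInteger>:
  MutableCollection, RandomAccessCollection
{
  public var container: Container
  public var elementBitWidth: Int

  /// An element with `1` for all bits.
  public var mask: Container { ~(~0 << elementBitWidth) }

  public var startIndex: Int { 0 }
  public var endIndex: Int { Container().bitWidth / elementBitWidth }

  public subscript(index: Int) -> Container {
    get { container >> (index * elementBitWidth) & mask }
    set {
      let index = index * elementBitWidth
      container &= ~(mask << index)
      container |= (newValue & mask) << index
    }
  }
}

public extension BinaryInteger {
  /// The bits of this integer, from least significant to most.
  var bits: SmallIntegerCollection<Self> {
    get { .init(container: self, elementBitWidth: 1) }
    set { self = newValue.container }
  }

  /// The nybbles of this integer, from least significant to most.
  var nybbles: SmallIntegerCollection<Self> {
    get { .init(container: self, elementBitWidth: 4) }
    set { self = newValue.container }
  }
}

var x: UInt8 = 0b1010_0110
// Low nybble 0110 yields bits 0, 1, 1, 0; high nybble 1010 yields 0, 1, 0, 1.
assert(Array(x.bits) == [0, 1, 1, 0, 0, 1, 0, 1])
assert(Array(x.nybbles) == [0x6, 0xA])
// Writing through the computed property round-trips into the integer.
x.nybbles[1] = 0xF
assert(x == 0b1111_0110)
```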

More sadness now that primary associated types are thrown into the mix:

An associated type named 'Key' must be declared in the protocol 'ExpressibleByKeyValuePair' or a protocol it inherits

public protocol ExpressibleBy2Tuple {
  associatedtype Element0
  associatedtype Element1

  init(_: (Element0, Element1))
}

public protocol ExpressibleByKeyValuePair<Key, Value>: ExpressibleBy2Tuple {
  typealias Key = Element0
  typealias Value = Element1
}
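For what it's worth, re-stating seems to apply here too: declaring Key and Value as genuine associated types, pinned to the inherited ones with same-type constraints, satisfies the requirement that primary associated types be declared in the protocol itself. A sketch (the Pair struct is hypothetical, added only to exercise the conformance):

```swift
public protocol ExpressibleBy2Tuple {
  associatedtype Element0
  associatedtype Element1

  init(_: (Element0, Element1))
}

// Primary associated types must be declared in this protocol itself,
// so re-state them and constrain them to the inherited ones.
public protocol ExpressibleByKeyValuePair<Key, Value>: ExpressibleBy2Tuple
where Key == Element0, Value == Element1 {
  associatedtype Key
  associatedtype Value
}

// Hypothetical conforming type, only to check that the constraints line up.
struct Pair: ExpressibleByKeyValuePair {
  var key: String
  var value: Int

  init(_ elements: (String, Int)) { (key, value) = elements }
}
```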

fwiw, these are the problems leading me to turn away from Swift.

I regularly hit associated-type problems I cannot solve or even understand. I've been a staff engineer for multiple large teams and once worked on a programming language. Jessy is a long-time, deep contributor. We should be able to get these sorted without forum posts. Until now, I'd held out hope that I was simply misguided and would eventually arrive at the correct understanding.

Now the prospect that they're (design?) bugs too hard to fix has huge implications for the projects I'm planning. I'm deeply saddened, really, because I've grown to love Swift.

I'm very grateful for this direct acknowledgment of flaws, because it helps with partial solutions -- avoidance, or moving forward via workarounds. Any pointers to bugs or discussions characterizing the problem would be welcome. :pray:


Are we talking about associated type inference, or are you having trouble learning Swift generics and protocols with associated types in general?

"Associated type inference" is the specific feature that inserts implicit typealiases into types that implement protocols with associated types. Here is a simple case:

protocol P {
  associatedtype A
  func f() -> A
}

struct S: P {
  func f() -> Int { 0 }
}

You can omit the typealias A = Int in struct S because it's inferred from f().

You can usually just define the type aliases yourself instead; writing this also works fine:

struct S: P {
  typealias A = Int
  func f() -> Int { 0 }
}

There are bugs with some of the more complex cases of the above, but they are certainly not too hard to fix; it just needs some focused effort in this area.
