Is it possible to assign a typealias for a 2D array in Swift, like `typealias MatrixT<T> = [[T]]`?

I'd like to assign a typealias for a generic 2D array (I don't want to create a new type `struct MatrixT<T> {}`). So I do the following:

typealias MatrixT<T> = [[T]]

But when I started writing an extension for it, I found that the compiler doesn't actually understand that MatrixT is a 2D array. It recognizes the type of self as [Element]:

extension MatrixT {
    var columnsCount: Int {
        let copy = self // the compiler infers this as: let copy: [Element]
        let row = self[0] // the compiler infers this as: let row: Element
        return 0
    }
}

But outside of the extension, the Swift compiler understands that an element of MatrixT is an array:

func testCreation() {
    let matrix: MatrixT = [[0]]
    let firstRow: [Int] = matrix[0] // correct
    let columnsCount = firstRow.count
}

Why can't I refer to the type MatrixT in the extension as a 2D array ([[T]])?

Element and T are different types

MatrixT is an array of arrays of T. If you write an extension of MatrixT, you're writing an extension of that outer array, which means that its elements are arrays of T, not just T.

You can try to print Element.self or type(of: self[0]) to check that out.
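For example, a small check inside the extension (inspectTypes is just an illustrative name):

typealias MatrixT<T> = [[T]]

extension MatrixT {
    // Hypothetical helper just to see which types the extension works with.
    func inspectTypes() {
        print(Element.self)      // for a MatrixT<Int>, prints Array<Int>
        print(type(of: self[0])) // likewise Array<Int>
    }
}

let m: MatrixT<Int> = [[1, 2], [3, 4]]
m.inspectTypes()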

You can. [[T]] is a different way to write [Element] in the context of an extension of MatrixT.

I hope my explanation isn't confusing.


Yep, thank you. Now it's clear.


Note that, while you can do this, it's a pretty bad abstraction for matrices, since it doesn't enforce the constraint that every row has the same number of columns. I would advise against it, unless you have a specific use case in mind where that's not an issue.
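For example, nothing stops a ragged value from typechecking as a matrix:

typealias MatrixT<T> = [[T]]

let ragged: MatrixT<Int> = [[1, 2, 3], [4]] // compiles fine
// Any column count derived from ragged[0].count (3) is wrong for row 1.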


Yes, this.

I would abstract the nested array in a wrapper type, generic on the element type, restricting the element type to Numeric or similar, whose initializer takes a row and column count. You probably want an alternate initializer that adds a sequence or closure argument to provide the initial values.

Behind a wrapper type, you can regulate the allocations, and keep the user from making the nested array ragged. Your subscript would still use two coordinates. Since your allocations are hidden, you can flatten the storage to a single [T] instead of the fractured allocations of [[T]] and remap the two public coordinates to a single internal index. String does a similar optimization; it flattens its data to a single block of UTF-8 code units.

You may want to provide two view types, one for a collection of row vectors and another for a collection of column vectors.

Personally, my Matrix type has an unconstrained Element, and then various constrained extensions to provide numeric-specific behaviors. Something like:

struct Matrix<Element> {
  private(set) var rowCount: Int
  private(set) var columnCount: Int
  private(set) var elements: [Element]
  
  init(rows: Int, columns: Int, elements: [Element]) {
    precondition(elements.count == rows * columns, "need exactly rows * columns elements")
    rowCount = rows
    columnCount = columns
    self.elements = elements
  }
  
  // Row-major mapping of the two public coordinates to one flat index.
  subscript(_ row: Int, _ column: Int) -> Element {
    elements[row * columnCount + column]
  }
}

extension Matrix where Element: AdditiveArithmetic { /*...*/ }
extension Matrix where Element: Numeric { /*...*/ }
extension Matrix where Element: FloatingPoint { /*...*/ }

// etc.
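For the row and column views, a lighter sketch (row(_:) and column(_:) are hypothetical names, assuming the Matrix type above):

extension Matrix {
  // Row r is one contiguous slice of the flat storage.
  func row(_ r: Int) -> ArraySlice<Element> {
    elements[r * columnCount ..< (r + 1) * columnCount]
  }
  
  // Column c gathers one element from each row, stepping by columnCount.
  func column(_ c: Int) -> [Element] {
    stride(from: c, to: elements.count, by: columnCount).map { elements[$0] }
  }
}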

Very elegant. :+1:

Do you have some more on this?

I don't think this was explained well. While there are better ways to do matrices, as a user I'd expect this to work:

typealias MatrixT<T> = [[T]]

extension MatrixT {
    var columnCount: Int {
        // self[0].count //🛑 Error
        (self[0] as! [Any]).count // workaround, is this O(n)?
    }
    var rowCount: Int {
        self.count
    }
    subscript(x x: Int, y y: Int) -> T { //🛑 Error
        self[y][x] //🛑 Error, no workaround
    }
}

let m = MatrixT(repeating: [Int](repeating: 1, count: 10), count: 10)
print(m.columnCount, m.rowCount) // 10 10
print(m[x: 1, y: 1]) // expecting to get: 1
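The root of the problem is that T isn't nameable inside the unconstrained extension. A free generic function can name it, so this much does work (a sketch; element(of:x:y:) is a made-up name):

func element<T>(of matrix: MatrixT<T>, x: Int, y: Int) -> T {
    matrix[y][x]
}

print(element(of: m, x: 1, y: 1)) // 1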

This works:

typealias MatrixT<T> = [[T]]

extension MatrixT where Element == [Double] {
    var columnCount: Int { self[0].count }
    var rowCount: Int { self.count }
    subscript(x x: Int, y y: Int) -> Double { self[y][x] }
}

let m = MatrixT(repeating: [Double](repeating: 1, count: 10), count: 10)
print(m.columnCount, m.rowCount)
print(m[x: 1, y: 1])

But that's not exactly what's wanted, as the Double element type is hardcoded.
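One way to keep it generic is a constrained extension, sketched here (not from the thread; it works because [T] is a RandomAccessCollection indexed by Int):

extension MatrixT where Element: RandomAccessCollection, Element.Index == Int {
    var columnCount: Int { self[0].count }
    var rowCount: Int { count }
    // Element.Element is T for a MatrixT<T>, so nothing is hardcoded.
    subscript(x x: Int, y y: Int) -> Element.Element { self[y][x] }
}

let m = MatrixT(repeating: [Int](repeating: 1, count: 10), count: 10)
print(m.columnCount, m.rowCount) // 10 10
print(m[x: 1, y: 1])             // 1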