Formalizing a Numerical/ML Working Group

@Troy_Harvey and @porterchild thank you for taking the time to answer these questions, this has been very helpful.

I think my intuition was pointing in the same direction as yours; I agree that, from my limited experience, neural nets seemed too limiting. I've been approaching the problem by asking: how can we build a more abstract data structure/model over, or even above, tensors and tensor products (in the mathematical sense, not strictly multi-dimensional arrays)? I'm mostly sold on what you're proposing, though, which sounds like starting with a less constrained ontology that could include any arbitrary differentiable type, giving users the freedom not to be tied to any specific kind of data structure. I'll certainly keep researching in the direction I have been, out of curiosity. I'm also very interested in seeing what this definition language ultimately looks like and how it operates!
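To make "any arbitrary differentiable type" concrete, here is a minimal sketch of differentiating through a plain struct rather than a tensor. This assumes a toolchain with differentiable Swift support (the experimental `_Differentiation` module); `Point` and `squaredNorm` are illustrative names, not part of any existing library.

```swift
import _Differentiation

// A non-tensor differentiable type: a plain 2D point.
// The TangentVector is synthesized automatically.
struct Point: Differentiable {
    var x: Double
    var y: Double
}

// An ordinary function over that type, marked differentiable.
@differentiable(reverse)
func squaredNorm(_ p: Point) -> Double {
    p.x * p.x + p.y * p.y
}

// Reverse-mode AD over the custom struct, no tensors involved.
let g = gradient(at: Point(x: 3, y: 4), of: squaredNorm)
// g.x == 6.0, g.y == 8.0
```

The point of the sketch: the derivative machinery attaches to the type system itself, so the "ontology" isn't constrained to multi-dimensional arrays.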

On that note, if we (as in the community) were to start a working group how could that operate? How could we organize around building a roadmap? Is this definition language planned to be a future piece of it?

It's a good question, and why I started this thread.

First, Swift has a unique set of attributes that aren't represented or perhaps aren't easily possible in other languages, and I think there is real opportunity to leverage and elevate that in the general ML community.

There are core building blocks and frameworks missing from Swift open source that either we have, others have built, or we can fill in together. Some of these pieces include introspection & metaprogramming tools, numerical methods, fixed-size arrays, distributed computing, dataframes, backend-independent NN frameworks, constraint engines, and generalized differentiable-programming scaffolding. These topics are general and helpful to many domains.

My 1st goal was to assess the community interest.

My 2nd goal was to navigate a more strategic approach to open sourcing our own frameworks than "throw them on GitHub and see what happens".

My 3rd goal was to see if there is enough high-level interest to build a working-group roadmap for these building blocks. I'm a big believer in community-anointed libraries; they build so much more momentum in a community than random smatterings of GitHub libraries. Take dataframes, for example. There are probably 4 or 5 half-baked, half-built Swift dataframe projects. In Python there is Pandas — everybody knows it and backs it.

My 4th goal was to see if there was interest in housing such a working group here — or if it should be assembled outside these walls. We have some bandwidth to support it, but we are busy building our own tech, so it can't be an uphill climb.

While S4TF was a good start, it was competing with institutional inertia from its owner (and, within their walls, still searching for a killer application). Going forward, it's important to avoid that single-source dependency. We have a large set of internal initiatives and frameworks, but we also feel that distributed community investment is valuable. We started with Quantum, our DSL for physical-systems AI. Quantum is on the path to having dozens of corporate backers by the end of the year. We are going to house it in a non-profit so it has more independence and community openness. I think Quantum is one such killer application for differentiable Swift. We are also exploring open-source warehousing, under that same independent body, of some of our supporting frameworks that we think have general-purpose community value.

I should also note that, while we are investing in growing our team to work on Swift AI tooling, I am also open to supporting open-source efforts if groups or individuals have proposals.


I want to express my support for this working group. I don't have enough experience to make substantial contributions yet, but I'm very interested in specializing in the subject. Count me in to help in any way I can on the formalization of the working group.


+1 on what @Troy_Harvey said. There are a lot of TensorFlow-specific operators in S4TF, which detract from the ability to extend it to other backends. One example is Lanczos image resampling; another is the matrix-factorization ops. I'm planning to remove anything that isn't directly related to machine learning, although I may add some linear algebra operators once I can draft custom kernels in MSL and OpenCL. Outside of S4TF, there would be no convenient way to run linear algebra on TPUs.

This decision partially depends on whether SwiftFusion will recognize my fork as the successor to Swift for TensorFlow and shift their dependency to it. There are examples of SVD and Cholesky decomposition in their code. Just out of curiosity, is there any big effort to make a standalone linear algebra library for Swift that mirrors NumPy or the linear algebra operators in XLA?

You can certainly count me in; however, I'd likely rely on others' insights into direction and priority, given my limited experience specifically in ML. That said, I feel confident I can learn what needs to be learned and contribute that knowledge toward something material.

Curious what you had in mind?

Agreed wholeheartedly. Would it be worthwhile reaching out to the authors of those libraries directly and pointing them towards this forum?

I.e. core team involvement, or something different? It's curious that no current core team member has replied in this thread, though it's nice to look back a year and see Chris Lattner's direct support.

Given the following:

The project lead makes senior appointments to leadership roles, with those leaders coming from the worldwide Swift community of contributors. ... Ted Kremenek is the appointed representative from Apple, and acts as the voice of the project lead. Source

and that such a decision would likely require appointment.

@tkremenek, what might it require to form a new working group for ML under this structure?

Procedurally, the Core Team will probably be tied up for the next few weeks at least carrying out the reorganization. But one of the goals of that reorganization is to better support having working groups for diverse efforts like this, and just speaking for myself, I think having a Swift ML working group would make a lot of sense.


Just to add to this discussion about Tensor terminology: I think Matrix is a more generic term, suitable for a wider range of useful numerical operations. Tensors are often represented as matrices, and therefore plenty of matrix operations have meaning as tensor operations. However, there are plenty of useful things one can do with matrices that don't make sense as tensors.


This confuses me a bit. Tensors are actually a more generic object than matrices, and therefore suitable for a wider range of operations. Anything you can do with a matrix you can do with a tensor; the reverse is not true.
Maybe I'm mixing things up, however, and the mathematical tensor object is completely unrelated, or at least dissimilar, to the tensor object being discussed here.

Tensors have a far more restrictive set of requirements than matrices do. For example, they have to change basis following a particular set of rules (i.e., covariant and contravariant components)---and if they don't, they're not tensors. So it's easy to construct matrices of data that don't follow tensor transformation rules, but in contrast, any tensor can be represented as a multi-dimensional array given a choice of basis.
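For reference, the basis-change rule being described can be written (for a rank-2 contravariant tensor, with summation over repeated indices) as:

$$
T'^{\,ij} \;=\; \frac{\partial x'^{\,i}}{\partial x^{k}}\,\frac{\partial x'^{\,j}}{\partial x^{l}}\,T^{kl}
$$

A generic matrix of data (say, rows of samples by columns of features) carries no such transformation law, which is exactly why it need not be a tensor.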

Ok, I think that's sort of what I meant. From any tensor you can always "drop down to the matrix level" given a choice of basis.


To echo the more general point brought up a few times: you don't want to restrict yourself to thinking only about tensors when designing a good numerical library, because that will unnecessarily restrict its applicability. For example, I do both a bunch of spectral modeling and a bunch of stochastic modeling---the former lends itself nicely to the tensor abstraction, but the latter does not---yet both use many of the same matrix operations.


I think that we’re crossing wires in terms of “tensor” as a mathematical object and “tensor” as a data structure that’s used in machine learning. Oftentimes, people use the word “tensor” colloquially to refer to just a generalized matrix with a rank that may be greater than 2, without any of the connotations of consistency across bases, etc.

I suspect you are correct that in the last few years people have started abusing the terminology---and that's also why it's important to get a larger diversity of perspectives when creating a working group for a numerical computing library (so as not to make that same kind of mistake). If the goal really is to create a numerical computing library to be used by a broader community, then I would hope one could recruit perspectives that aren't currently represented in these forums.

Edit: Let me add that my big worry is that a numerical library gets created that is unnecessarily niche, and that's why I want to advocate for some broad thinking about the different ways folks would use such a library. E.g., a discussion around Tensor vs Matrix vs Array is something that needs to happen with a very broad group, so as not to get trapped into one way of thinking.


The conflation of tensors and matrices seems to have brought limitations to S4TF. The matrix-decomposition and matrix-band ops were specific to one domain: linear algebra. The word "matrix" was even in the name of some raw ops - matrixBandPart and matrixSetDiag, to name a few.

The linear algebra component of S4TF wasn't close to feature parity with Python TF's linalg, yet it limited S4TF's extensibility to other backends. Everything else operated on tensors of varying rank (e.g. Conv2D, Conv3D, depthToSpace) that are supported by MPSGraph and DirectML. The matrix ops could be removed from S4TF and reimplemented in a more feature-complete "numerical computing library".
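One way that split could look, as a sketch only (all protocol and type names here are hypothetical, not from S4TF or any existing library): keep the rank-generic ops every backend must support in one protocol, and put matrix-specific linear algebra in a separate protocol that backends may opt into.

```swift
typealias Matrix = [[Double]]

// Ops every backend must support (rank-generic in spirit;
// shown here on 2-D arrays for brevity).
protocol CoreOps {
    func matmul(_ a: Matrix, _ b: Matrix) -> Matrix
}

// Matrix-specific linear algebra a backend may opt into.
protocol LinAlgOps: CoreOps {
    func choleskyLower(_ a: Matrix) -> Matrix
}

// A naive CPU backend conforming to both protocols.
struct CPUBackend: LinAlgOps {
    func matmul(_ a: Matrix, _ b: Matrix) -> Matrix {
        let n = a.count, m = b[0].count, k = b.count
        var c = Matrix(repeating: [Double](repeating: 0, count: m), count: n)
        for i in 0..<n { for j in 0..<m { for p in 0..<k {
            c[i][j] += a[i][p] * b[p][j]
        }}}
        return c
    }

    // Unblocked Cholesky: a = l * lᵀ with l lower-triangular.
    func choleskyLower(_ a: Matrix) -> Matrix {
        let n = a.count
        var l = Matrix(repeating: [Double](repeating: 0, count: n), count: n)
        for i in 0..<n {
            for j in 0...i {
                var sum = a[i][j]
                for p in 0..<j { sum -= l[i][p] * l[j][p] }
                l[i][j] = (i == j) ? sum.squareRoot() : sum / l[j][j]
            }
        }
        return l
    }
}

let backend = CPUBackend()
let l = backend.choleskyLower([[4, 2], [2, 3]])
// l[0][0] == 2, l[1][0] == 1, l[1][1] == √2
```

A backend like MPSGraph or DirectML could then conform only to the core protocol, without being forced to ship (or fake) the linear algebra surface.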


It seems that there may be enough interest in Numerical/ML to get an official working group. I would be willing to join if there was a video call kick-off. Does anyone else feel the same way?


I would be excited to be a part of this :+1:


I do!

What are the responsibilities and expected skills of a working group member? I would love to join and contribute if my skills would benefit the team.

Anything I may be lacking currently, I can learn, and I do have the time.


You would have to ask a project leader like @tkremenek, but here's my expectation. I don't expect it will require a lot of time and dedication, because 1) we're a very small subset of the Swift community. I don't even know how many people would attend; that's why I asked on this thread. If Swift Forums has an official polling mechanism, it would be great for a moderator to employ it here.

And 2) I think most people on this thread are not hired full-time to work on Swift for ML. In contrast, the C++ interop working group has multiple full-time contributors. We would just occasionally meet on Zoom and talk. We don't know much about how Swift is used outside of S4TF and Quantum, and there are a lot of ideas to discuss. I already have a few questions, like whether anybody else has used Swift-Colab or thought about a Swift substitute for NumPy.


I'm in! Should we find an official place where we can communicate and share our experiences? I don't think this thread quite does the job.