[Pitch] Formally defining consuming and nonconsuming argument type modifiers

Right, your mental model is much closer to Rust's ownership. In Swift, it's more like you own a reference to that particular object. There can be multiple references to the same object, each with a different owner, and each owner is responsible for discarding its reference (i.e., stopping use of the reference and calling release). You can then own a new reference to the object by copying the reference and calling retain.


In the non-consuming case, the call doesn't increase the reference count. So it can be thought of as a single reference shared by two (or more) owners. Maybe this is why it's called __shared in the current implementation :smiley:

Yes, well, maybe. It's more useful to treat each reference as having a single owner (i.e., the one that will call release). If you're using the reference but did not participate in the reference counting, you could also say that you're simply borrowing the reference (and will return it to the original owner afterward). It does help you keep track of

  • Who will call release (owner), and
  • When can release be called (after all borrowers return the reference).

This is why you see borrowing being proposed as an alternative to nonconsuming and shared.
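To make the two conventions concrete, here's a sketch in the pitched spelling (FileHandle, store, and inspect are hypothetical names; borrowing stands in for the nonconsuming convention discussed above):

```swift
final class FileHandle {}  // hypothetical class type, so there's ARC traffic

// consuming: the caller's +1 transfers to the callee, which either
// stores the reference or releases it when `handle` goes out of scope.
func store(_ handle: consuming FileHandle) { /* keep it around */ }

// nonconsuming/borrowing: the callee merely borrows the caller's
// reference, so the call itself needs no retain/release pair.
func inspect(_ handle: borrowing FileHandle) { /* read only */ }
```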

FWIW, much of the ownership concept predates ARC; see Memory Management Policy and Ownership Policy. Though I'm not sure where the concept of borrowing began in this ecosystem. If you're still curious about that, you can spin up a new thread.


borrowing can mean immutable borrowing and mutable borrowing,
for example:

func testBorrowing(_ arg1: borrowing Int) {
    // ...
}

is it clear what kind of borrowing it means? Mutable or immutable?

Anyone can mutate the object (simply by assigning it to a new variable*). Whether the mutation is observable to the caller depends on whether the argument is marked inout** (or whether the object has reference semantics). If you want the mutation not to incur CoW, you must ensure that the reference is unique and pass the argument as consuming.
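As a sketch of that point (assuming the pitched borrowing spelling; the copy-on-write behavior itself is just standard Swift):

```swift
func appendOne(_ values: borrowing [Int]) -> [Int] {
    var copy = values  // assigning to a var creates a (formal) copy
    copy.append(1)     // CoW duplicates the shared buffer here, because
                       // `values` still references the original storage
    return copy
}
```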

Also, these markings don't mean much to trivial types, which have no ARC traffic. We wouldn't want to reject that outright, though. That'd be too magical, and we need to consider such a combination anyway for cases like generics.

Ok, many people seem to be confused by the term borrowing. It suggests that it may not be the best word choice, or that Rust's influence is simply too strong.

* Arguably, that just creates a copy, then mutates that copy. However, that is also the case for all CoW types with non-unique references.
** We might want to be careful around inout. In addition to ARC, it also plays a role in Exclusivity Rules.
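On that second footnote, the classic illustration of the exclusivity rule that inout carries (standard Swift, not pitch syntax) is:

```swift
var stepSize = 1

func increment(_ number: inout Int) {
    number += stepSize  // reads stepSize while it may be under exclusive access
}

// increment(&stepSize)  // conflicting accesses to 'stepSize': the inout
// access spans the whole call, so reading the same variable inside
// the function body violates the Law of Exclusivity.
```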

I'm quite happy to finally have more official tools in the language to control memory ownership when necessary. +1

Thanks for that, it clarified a bit how it works. While reading I felt like I'd forgotten my days using MRC! I'm actually still a bit confused about why the +1 of a consuming arg needs to be in the caller and not the callee; it's probably obvious, but I'm not seeing it right now. :thinking:

In terms of naming, I guess consume/nonconsume is getting into people's heads already, but for me it's more confusing than the own/borrow words used in Rust. That said, in that language things are more clear/explicit, so they are also less confusing.

For example, right now using these keywords doesn't change the semantics of our code, since Swift will insert ARC operations/copies when needed anyway, right? It would only matter for semantics once we have move-only types (or the @noImplicitCopy mentioned in the roadmap), since at that point a consumed argument won't be able to be used in the caller after the call. Correct?
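For instance, I'd expect something like this to keep compiling today (a sketch, assuming the pitched consuming spelling):

```swift
func take(_ s: consuming String) {
    print(s)  // take owns this reference and releases it at the end
}

let greeting = "hello"
take(greeting)   // the compiler copies (+1) on our behalf
print(greeting)  // still allowed today, because String is copyable
```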

From the roadmap:

Would it be useful to specify in the proposal what role inout plays (if any) in this?

One of the unique things about the inout parameter modifier is that any corresponding argument variable must be prefixed with an & symbol.

func doubleInPlace(number: inout Int) {
    number *= 2
}

var myNum = 10
doubleInPlace(number: &myNum)
print(myNum)  // 20

It lets the reader (and presumably the memory safety checker in the compiler) know that the caller intends to pass an argument by-reference (or by-value-result) instead of the default heuristic.

@Michael_Gottesman what are your thoughts on prefixing caller arguments with the & symbol (or any symbol) for consuming and/or nonconsuming callee parameters? Would this extra syntax be useful for the compiler’s safety checker at all? Should the reader be informed about an intentional change away from the default heuristic at the call-site?

The idea is that the caller, bar, retains ownership of the consuming argument arg1 (+1), then gives that ownership to the callee, foo. At this point, foo owns the reference, and bar does not*. Since foo now owns the reference (that bar gave it), it is responsible for releasing it (-1).

This is the main idea of consuming: the caller retains and gives the retained reference to its callee, and the callee then releases it. It is ARC-efficient even across modules if

  • The caller does not use the argument afterward, and
  • The callee does keep the argument around.

Depending on the level of deviation from this scenario, we may see more (unnecessary) ARC traffic.

* If needed, it can retain another reference.
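Spelled out as a sketch (Thing is a placeholder type; the retain/release points in the comments are what the convention implies, not code you write):

```swift
final class Thing {}

func foo(_ arg1: consuming Thing) {
    // foo owns the reference bar handed over. If foo stores it,
    // no extra retain is needed; otherwise foo releases it (-1)
    // when arg1 goes out of scope.
}

func bar() {
    let thing = Thing()  // +1: bar owns a reference
    foo(thing)           // ownership (and the +1) transfers to foo
    // bar doesn't use `thing` afterward, so no extra retain/release pair.
}
```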

It matters for cross-module interfaces when the compiler can't optimize away the ARC traffic across module boundaries. Libraries like the standard library would be the biggest consumers (ha!).

That sounds like it would be either inconsistent (implicit non-consuming arguments would not get the & prefix, while explicit ones do) or source-breaking (implicit non-consuming arguments now require the prefix). It's quite a high price to pay in either case.

Could you elaborate more on this? AFAICT, let a = b is the non-consuming local variable assignment, let a = move(b) is the consuming one, and ref a = b (or whatever it's called) would be a +0 anyway. So I'm definitely missing something.


Why would it ever be allowed to do that? Moving into a local variable wouldn't change the refcount, so it should never cause a release.

Well, I don’t know what a local ref means because that’s not concretely proposed. If you want to guarantee that something isn’t copied, you may need to borrow it from its current location, which means enforcing exclusivity on that location for as long as you need the borrow. Like inout but without the implication of mutation.


Using owned instead of consuming could mislead, because Swift already has the unowned keyword, which refers to unowned references, whereas owned would refer to owned values.
And I doubt consuming is a good keyword either.

The thing is, the compiler could actually know when that is the case. If it were left implicit, could the compiler actually choose between the two at each call site? It'd have the necessary information, unlike the programmer.

I feel like there’s room for compile-time heuristics that are more sophisticated than “Is this an initializer or setter?”. If there is, we may want to treat this more as a compiler hint (if you can’t tell which is better, do this) than a hard rule.

Right, I only vaguely remember ref from a post long ago, and I couldn't find it anymore. Please ignore it.

That said, I'm still not sure if consuming and nonconsuming are the right tools for the local borrowing (name TBD) you're looking for. consuming and nonconsuming as proposed don't even have any notion of exclusivity attached to them. The local borrowing seems more akin to inout and whatever the non-mutating pass-by-reference argument convention is (normal arguments make a copy, and don't even hold read access to the original value).

This hinges on the requirement that the compiler can emit different binaries for the same function, which isn't the case everywhere, and most definitely not at module boundaries.

In the scenarios where the compiler can do that, such as within the same file or the same module (with WMO enabled), these calling conventions won't do much (maybe a bit, since they could help guide the compiler toward a better calling convention).

That’s what I’m thinking: we should define these modifiers such that a hypothetical future compiler has room to overrule them.

Without commenting on the larger point, isn’t this exactly what @usableFromInline does?

There are many situations where the compiler could be able to switch between the two, most notably inlining and specialization.

As for module boundaries, cross-module optimization (which may or may not be stable right now, it’s frustratingly unclear) allows the compiler to ignore them entirely in favor of optimizing everything it can. Ideally, the compiler should be able to produce a binary that is literally impossible to improve upon (Pareto optimal) at that point. You know, eventually.

That sounds like we're putting the cart before the horse. To allow the caller to decide the calling convention even for libraries in binary form, we would require a certain level of dynamism. I don't think these small ARC optimizations would outweigh the cost of such dynamism.

Yes, but that just pushes these boundaries a little deeper into the module; it doesn't eliminate them. Compilers can see @inlinable functions and optimize ARC around them, but they still need to follow the calling convention around @usableFromInline functions, which would only be available in binary form.

If you want to eliminate such boundaries, you'd need to distribute the library as source code, which is not ideal, or even possible, in many cases.

Lest we forget dynamic linking, where the libraries are compiled prior to the application(s) using them, and we can't change the compiled libraries.

I think we're getting off-topic, or rather, out of scope. I'm not exactly sure :thinking:.

In most scenarios the compiler probably wouldn't be able to tell which would be better, most obviously when the caller isn't being compiled at the same time as the callee. I'm merely pointing out that there are many scenarios where it could, and we should avoid a situation where using either of these modifiers results in a worse outcome.

This is how @inlinable works: the compiler may or may not inline code with or without that attribute. In fact, CMO ignores it entirely! Code may be inlined in some places but not others, based on whether binary size or runtime performance is being emphasized, etc. There are a lot of variables, and most of them are complete unknowns when the code is written.

That’s how Swift Package Manager works, which is where I expect the bulk of the usage is going to be.

It’s from the performance roadmap.


Yes, there are definitely such cases, common even. Better yet, in some cases, such as when the callee is non-public, the compiler can even ignore the marked convention entirely (though I'm sure figuring out the optimal convention is also a rather complex process).

OTOH, these keywords are essential where the compiler most definitely can't do such things, by design and necessity. It's not just about optimization; it's also about the interfaces that a library can craft.