`borrow` and `take` parameter ownership modifiers

I think you'd need to have an explicit take or borrow in source in order to disambiguate, because otherwise, it's up to the optimizer or dataflow analysis to decide whether the call site is a taking use, and we do that dataflow analysis well after overload resolution.
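
To illustrate (with an invented function and the proposal's hypothetical syntax), overloading on the convention alone would leave the type checker with nothing to go on at the call site:

// Hypothetical overloads differing only in the parameter convention;
// 'process' and the idea of overloading on convention are invented here.
func process(_ s: borrow String) { print("borrowed: \(s)") }
func process(_ s: take String) { print("took: \(s)") }

let name = "Swift"
process(name)       // nothing at the call site says whether this is a taking use
process(take name)  // an explicit take (or borrow) in source could disambiguate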

4 Likes

Maybe I'm holding it wrong, but it doesn't appear that pattern matching by itself in Rust suppresses implicit drop. If I do something like:

pub struct Butt { x: Vec<i32> }

extern "C" {
    fn drop_butt();
}

impl Drop for Butt {
    fn drop(&mut self) {
        unsafe {
            drop_butt();
        }
    }
}

pub fn butz(x: Butt) {
    match x {
        Butt { x } => {}
    }
}

then I get "cannot move out of type Butt, which implements the Drop trait". I think it makes sense in general to limit the ability to "forget" move-only values without running their deinits to places that are controlled by the type's own API. Although Rust marks mem::forget as safe, with the justification that destructors may not run anyway because of process death, Rc leaks, and such, that justification bites back when considering the relative ordering of destructors with lifetime-dependent values. Being able to forget the LockGuard from a Mutex is a good example of this; doing so will leave the mutex locked outside of the scope of the static shared access, which may in turn allow the owner of the mutex to move it while the OS still has lock bookkeeping tendrils into the mutex at its current location, causing corruption when the lock implementation tries to update bookkeeping in a moved-out-of lock value.

Oops, you are right. I thought I used this before but it must have been a Copy field (like a pointer). If Rust can get away without this, maybe Swift can too, even if it means we need internal optionality in some cases where we otherwise wouldn't.
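
For what it's worth, here's a sketch (types invented) of the kind of "internal optionality" workaround this refers to, in today's Swift:

// The field is logically always present, but is stored as an Optional
// purely so a consuming operation can move it out and leave nil behind
// for deinit to skip.
final class Connection {
  func shutDown() { /* release the underlying resource */ }
}

final class ConnectionBox {
  private var connection: Connection?   // Optional only to allow moving out

  init(connection: Connection) { self.connection = connection }

  // Consuming operation: hand ownership to the caller and clear the slot
  // so deinit doesn't also shut the connection down.
  func takeConnection() -> Connection {
    guard let c = connection else { fatalError("already taken") }
    connection = nil
    return c
  }

  deinit { connection?.shutDown() }
}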

Instead of having a function with magic behavior like mem::forget, we could maybe allow for a special form of switch-ing on self inside a taking func:

switch deinit self {
case Butt(x: x):
  consume(x)
}

which would let you move out the components of self and disable its implicit deinitialization, allowing consuming operations to forward ownership of those components without the original type needing an intermediate invalid state.

3 Likes

Do you mind explaining what switch means in this context? I’m assuming it’s a direct analogue to match x in the Rust example, but not being very familiar with Rust I don’t understand why that syntax makes sense for something that doesn’t appear to be an enum.

1 Like

switch in Swift and match in Rust are both general pattern-matching facilities. Although pattern matching is very commonly used with enums to switch over the different cases, you can use it with other types that support pattern matching too, like:

let x: Int = 100
switch x {
case 0..<60: print("failed")
case 60...69: print("D")
case 70...79: print("C")
case 80...89: print("B")
case 90...99: print("A")
case 100: print("S")
default: fatalError("cheater")
}

We don't currently have struct pattern matching in Swift (though Rust does), so I made up syntax for it to be analogous to the Rust example.

I suppose the deinit behavior doesn't necessarily need to be specially tied to switch, either; it could be an analog to the explicit move/take operator we'd discussed under SE-0366 that additionally disables deinit of the structure, allowing you to forward ownership of the value's components.
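
Purely as illustration (the type, method, and exact spelling are invented here), that operator form might read something like:

struct FileDescriptor {  // imagined move-only struct with a deinit
  let fd: Int32
  deinit { close(fd) }

  taking func relinquish() -> Int32 {
    // hypothetical: applying the explicit take/move operator to self
    // consumes it, suppresses its implicit deinit, and lets ownership
    // of the stored components be forwarded out
    return (take self).fd
  }
}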

How would this work in the presence of private properties? Would they be immediately discarded?

For deinit self to work at all, it seems to me that you need to be in a context that has full visibility into the underlying type layout (just like a designated initializer), so its use would be restricted to methods in the original type declaration.

2 Likes

I realise that this is a theoretical future thing, but would it be possible to perform a second round of overload resolution after that dataflow analysis (perhaps a limited form which only decided between taking/borrowing variants of a function?), or does it cross something so fundamental that there's really no chance of it happening?

I've revised the proposal PR again to include some more topics that have come up in the discussion so far:

https://github.com/apple/swift-evolution/pull/1739/commits/50ef81a2ab173ab5923ce6968c622c09b1f858aa

  • Mention that nonescaping closure arguments cannot be take-n
  • Mention "set"/"out" parameter conventions as a future direction
  • Include discussion of destructuring move-only values that normally would have destructors,
    without invoking the normal deinit

Thanks everyone for continuing to help refine this proposal.

3 Likes

Nice!

I think "Destructuring methods" makes more sense in SE-0366: Selective control of implicit copying behavior. There is a gap now between the scalar move semantics in that proposal and the future work of "Destructuring methods".

Memberwise take will work wherever the layout is available and no deinit is provided:

func swap(x: take (AnyObject, AnyObject)) -> (AnyObject, AnyObject) {
  return (take x.1, take x.0)
}

It would be pretty strange to define a deinit method just to allow memberwise take.

We don’t expose which properties are stored and which are computed outside of a type, though; particularly not across module boundaries.

Yep, that's why we have taking methods! Maybe my example was too minimalist. The point is that you should be able to destructure without adding a deinit method or using a deinit keyword.

...also, the target of this proposal is systems programming, where it's common to expect optimization within modules, mark types as frozen, build without library evolution, or all of the above.

[EDIT] There is hefty intersection between this proposal and programmers who pay attention to low-level performance. The only relevant point made here is that we care about ownership features working reasonably in situations where the type's layout is known. That includes taking methods and frozen structs. @jrose has a good point that the syntax needs to work regardless of the build mode; even though the layout is known under whole-module optimization, that won't help us write memberwise destructures.

1 Like

I'm still not really sold on the name take, to be honest. I'm sorry to keep harping on about terminology, but we only get one chance to ship this feature and I think how approachable it ends up being will largely be decided by how intuitively the code reads.

I had some suggestions in the previous review. Even if we decide something else, I still think it's worth brainstorming this a little more. Personally, I don't think take is an improvement over move or consume.

That aside, and just taking the proposal as it currently is, if a method takes self (sounds weird, right? What I mean is that it consumes self via the take operation... you get it), we would write that function as:

taking func foo()

But parameters are written with plain take (no 'ing'):

func foo(_: take String)

I think it would read better for parameters to also be written taking. It's still not perfect but I think it's better than plain take.

func foo(_: taking String)
func append(contentsOf: take some Sequence<Element>)
func append(contentsOf: taking some Sequence<Element>)

More broadly, the proposal mentions that one motivation for using take like this is so we can use the same keyword in the function declaration and at the call-site. I'm not so sure that kind of simplicity necessarily leads to a more intuitive design.

  • As noted above, the proposal would already introduce a separate keyword for functions that consume self.

  • Function declarations are annotated throws, but errors are thrown with throw (no 's'), and calling a throwing function requires the keyword try at the call-site (see the short example after this list).

    Most people seem to understand that well, I think? I've never heard of anybody asking to annotate function declarations with try or to call a throwing function with throw.

  • async functions similarly require the keyword await at the call-site.

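As a quick reminder of how those three spellings already coexist today (example mine):

enum ParseError: Error { case notANumber }

// 'throws' on the declaration...
func parse(_ s: String) throws -> Int {
  guard let value = Int(s) else {
    throw ParseError.notANumber   // ...'throw' to raise the error...
  }
  return value
}

let n = try? parse("42")          // ...and 'try' at the call site.
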
So I don't think we should necessarily be afraid of using different keywords in different contexts if it makes the overall design easier to read. I really do think that is going to determine how comfortable developers feel using this feature and how often they use it correctly.

Ownership is seriously important. It deserves as many keywords as it needs to get the right design, IMO.

Is it, though? One pattern in my own code that I have found would benefit from these annotations is lazy collection views - I like to write methods which calculate indexes as initialisers on the index type itself (i.e. "calculate the next index from this position in this source collection"). That's just how I like to write it. But since parameters to an initialiser are implicitly take, this leads to additional retain/release overheads which can be significant. In order to avoid those overheads, I would need the features being proposed here - to explicitly specify that the source is a borrow parameter and isn't being consumed.

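A rough sketch of the shape (simplified, with placeholder names), using the proposed borrow modifier:

struct Deque<Element> {
  var storage: [Element]

  struct Index {
    var offset: Int

    // Without an explicit 'borrow', the source collection would be passed
    // with the implicit owned/take convention that initializer parameters
    // get, costing a retain/release on every index computation.
    init(after i: Index, in source: borrow Deque<Element>) {
      precondition(i.offset < source.storage.count)
      self.offset = i.offset + 1
    }
  }
}
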
Is that systems programming? Not really, I don't think. That's fairly regular code that you could see in any Swift app, and that's why I think it's important that these features are designed in a way that is approachable to all Swift developers.

3 Likes

Having written code at every layer of the stack from Adium down to libsystem, I agree that the differences are not as drastic as sometimes portrayed. I think it’s reasonable to hope that most code in most apps will not want to explicitly annotate these things, but that’s not the same as saying app developers won’t benefit from using it in a few places.

3 Likes

The Core Team have also clearly stated a goal of preventing the formation of new Swift dialects. Designing ownership solely based on the needs of a specific subset of Swift programs seems counter to this goal.

1 Like

I like move better for the operator and take for the parameter modifier. I think move is to take what & is to inout.
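
In today's Swift, for comparison, the declaration spells it inout while the call site marks the argument with &:

func bump(_ value: inout Int) {
  value += 1
}

var count = 0
bump(&count)   // '&' at the use site, 'inout' on the declaration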

1 Like

Sure, I'm not suggesting you would use it everywhere. I think we're in agreement that, while a lot of this can hopefully be automated by ARC and the compiler's optimisation pipeline, it's not a completely specialist-only feature, and even if you don't need to take control very often, it's still good to have an understanding of how these things work.

I guess maybe I'd compare it to self-driving cars. It's nice when the computer can do everything for you, but you still need a human behind the wheel ready to intervene when things don't go so well.

I'd also like to say something about optimisation - reducing execution time is important; we see developers here quite often who want to perform audio processing in Swift, and computer vision has exploded over the last 5-7-ish years with VR, AR, and ML models which can make sense of what they see in live camera feeds. Processing data in shorter and more predictable amounts of time has huge benefits for those incredibly popular classes of applications. As more computational power becomes available, there is no shortage of people ready to use it.

But it's more than that, too - it's also about power consumption. I sometimes speak with developers who think that, because they don't see reports of performance problems, they have little/nothing to gain from profiling and looking for ways to optimise their code. But even as hardware improves, and even if your application performs well - fits within its frame-time budget, never stutters, etc - if you can reduce its power consumption, it's a better app.

Probably most people on this forum already know this, but I'd just like to emphasise it.

At least IMO, optimisation is not a niche feature.

2 Likes

I've never understood "no dialects" to imply "don't add features which are more applicable to particular problem domains." We should of course try to design features (even those that are intended for niche problem domains) with an eye towards broader use cases, and not artificially restrict their applicability, but IMO "no dialects" should not be taken to mean that we don't add features for more narrow use cases.

4 Likes

I would think @dynamicCallable is testament to that. It was intended for easier inter-op with Python (and other scripting languages), but has found wider use.

2 Likes