Fair point. There are two opposing arguments that typically come up here:
- even when users try to control references, it's easy for aliases to accidentally sneak in
- potential side effects should be explicit, hence the `try` and `await` keywords
Right. A copyable variable can be "consumed" in two ways:
- explicitly, using a `consume` operator
- implicitly, by declaring it `consuming` and passing it to something that needs ownership
In either case, the compiler should not extend the lifetime beyond that consume.
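To make those two forms concrete, a minimal sketch (the `Resource`/`sink` names are illustrative):

```swift
final class Resource {
    deinit { print("released") }
}

func sink(_ r: consuming Resource) {
    // takes ownership of its argument
}

func explicitConsume() {
    let r = Resource()
    sink(consume r)   // explicit: `consume` ends r's lifetime here;
                      // using `r` afterward is a compile-time error
}

func implicitConsume(_ r: consuming Resource) {
    sink(r)           // implicit: a binding declared `consuming`, passed
                      // to a parameter that needs ownership, is moved
                      // rather than copied
}
```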
What's debatable is whether the compiler can shorten a lifetime before the point of the consume, or on paths without a consume. Doing so allows us to combine ownership control with normal ARC optimization. On the other hand, ARC optimization is less important in a world where copies must be explicit, and it is natural to assume that `_ = consume object` has the same effect as `withExtendedLifetime(object)`.
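For example, the question is whether the first function below must behave like the second (illustrative names):

```swift
final class Resource { deinit { print("released") } }
func use(_ r: Resource) {}
func doOtherWork() {}

func markedLifetime() {
    let object = Resource()
    use(object)          // last real use of `object`
    doOtherWork()
    _ = consume object   // is the release pinned here, or may the
                         // optimizer move it up after `use(object)`?
}

// The strict reading says markedLifetime behaves like this:
func extendedLifetime() {
    let object = Resource()
    withExtendedLifetime(object) {
        use(object)
        doOtherWork()
    }                    // release guaranteed to happen no earlier than here
}
```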
I lean toward allowing copyable lifetimes to be optimized because we can never change our mind after making them strict, and I would not want to add another dimension of performance annotation on top of the ownership controls.
We've tried that. The compiler does not know whether lifetime extension is required in any interesting cases. It's doing lifetime extension conservatively whenever it cannot prove that shortening to the last use is safe.
We get a lot of mileage out of ARC being fully deterministic given fixed inputs, including source code and compiler version. And we're adding ownership control to give programmers precise control so they don't need to think about what the optimizer would do. The debate here is mainly about when deinit side effects should be observable in regular code.
The "don't let me use this again" and "deinit no later than this" use cases seem compatible. The "don't deinit earler than this" use case would also be nice, but it doesn't seem very important for copyable types, there is already a way to do that (withExtendeLifetime), and optimization can be useful for copyable types in situations where we have a normal consume that wasn't written as an explicit lifetime marker.
I would like to encourage programmers to use ownership controls and tell them they'll get strictly better performance when those controls are used right. Here, the `consume` operator was used perfectly, but it may have disabled optimization.
Remember, you can always wrap a reference in a non-copyable type to get strict lifetimes if that's what you're after. (There's another lifetime debate we need to have about whether you also need a user-defined deinit, but that will be a different thread.)
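A sketch of that wrapper pattern, assuming noncopyable structs as adopted in SE-0390 (names illustrative):

```swift
final class Connection {
    func send(_ bytes: [UInt8]) { /* ... */ }
}

// The wrapper can't be copied, so the reference has exactly one owner
// and is released at a predictable point.
struct StrictConnection: ~Copyable {
    let connection: Connection
    deinit {
        // runs exactly when the wrapper's ownership ends
        // (whether this user-defined deinit should be required for
        // strictness is the separate debate mentioned above)
    }
}

func run() {
    let strict = StrictConnection(connection: Connection())
    strict.connection.send([1, 2, 3])
    // `strict` is destroyed exactly here, at the end of its scope,
    // unless it was explicitly consumed earlier
}
```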
You very rarely need it in practice. Only when you're abusing class deinitializers or doing some unsafe low-level programming.
That would be nice, but it's impossible. By design, there's no way to reliably analyze weak references and unsafe pointers.
Lifetime optimization can lead to observable differences whenever shared mutable state exists and there are no other barriers to optimizing the lifetime (no potential unsafe pointers or weak references into the managed object, and no calls outside of pure Swift code).
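A sketch of the kind of observable difference in question, with no pointers or weak references involved (names illustrative):

```swift
var log: [String] = []   // shared mutable state

final class Tracker {
    deinit { log.append("deinit") }
}

func run() {
    let tracker = Tracker()
    log.append("begin")
    _ = tracker          // last use of `tracker`
    log.append("end")
}
// Strict lexical lifetimes produce ["begin", "end", "deinit"].
// With lifetime shortening, the release may move up to the last use,
// producing ["begin", "deinit", "end"] instead.
```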
We only extend the lifetime of a variable that holds a strong reference within the variable's lexical scope. Without that lifetime extension, weak references would be effectively unusable without explicit lifetime control. The case people want to test is when some alias of the local ends up accidentally hanging around, which still works as expected.
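A sketch of the pattern that extension keeps working (names illustrative):

```swift
final class Delegate {}

func lexicalExtension() {
    let delegate = Delegate()
    weak var weakDelegate: Delegate? = delegate
    // `delegate` is never used again, but because it's a local holding
    // a strong reference, its lifetime extends to the end of its
    // lexical scope, so the weak reference still resolves here:
    precondition(weakDelegate != nil)
}
```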
Right. We have resisted making a guarantee that the compiler won't extend lifetimes (absent a consume as defined above), but I'm strongly biased toward the rule that deinitialization is never reordered with certain synchronization points. It provides basic debugging sanity and a reasonable level of programmer control.
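A hedged sketch of what that rule would buy, reading "synchronization point" as something like a lock acquisition (the names and the choice of lock are my illustration, not a spec):

```swift
import Foundation

let lock = NSLock()
var shared = 0

final class Job {
    deinit {
        lock.lock()
        shared -= 1   // deinit side effect on shared state
        lock.unlock()
    }
}

func work(_ job: Job) {}

func run() {
    let job = Job()
    lock.lock()
    shared += 1
    lock.unlock()
    work(job)        // last use of `job`
    lock.lock()      // synchronization point: under this rule the
                     // release of `job` is not hoisted above it, so
                     // this critical section reliably sees shared == 1
    let snapshot = shared
    lock.unlock()
    print(snapshot)
}
```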
I don't think this precludes memory pools though. To do that, we would allow the regular deinit path to relinquish its memory without invoking the deallocator.